Tag Archives: Flexibility

#435731 Video Friday: NASA Is Sending This ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

MARSS 2019 – July 1-5, 2019 – Helsinki, Finland
ICRES 2019 – July 29-30, 2019 – London, UK
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, PA, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The big news today is that NASA is sending a robot to Saturn’s moon Titan. A flying robot. The Dragonfly mission will launch in 2026 and arrive in 2034, but you knew that already, because last January, we posted a detailed article about the concept from the Applied Physics Lab at Johns Hopkins University. And now it’s not a concept anymore, yay!

Again, read all the details, plus an interview, in our 2018 article.

[ NASA ]

A robotic gripping arm that uses engineered bacteria to “taste” for a specific chemical has been developed by engineers at the University of California, Davis, and Carnegie Mellon University. The gripper is a proof-of-concept for biologically-based soft robotics.

The new device uses a biosensing module based on E. coli bacteria engineered to respond to the chemical IPTG by producing a fluorescent protein. The bacterial cells reside in wells with a flexible, porous membrane that allows chemicals to enter but keeps the cells inside. This biosensing module is built into the surface of a flexible gripper on a robotic arm, so the gripper can “taste” the environment through its fingers.

When IPTG crosses the membrane into the chamber, the cells fluoresce and electronic circuits inside the module detect the light. The electrical signal travels to the gripper’s control unit, which can decide whether to pick something up or release it.
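
That sense-then-act chain is simple enough to sketch in code. Below is a minimal, purely illustrative Python sketch of the decision loop; the sensor interface, threshold value, and function names are assumptions for illustration, not the UC Davis/CMU implementation.

```python
# Illustrative sketch of the "taste"-driven grip decision described above.
# All names and the threshold are hypothetical stand-ins.

FLUORESCENCE_THRESHOLD = 0.5  # normalized photodetector reading (assumed)

def read_photodetector() -> float:
    """Stand-in for the circuit that measures light from the E. coli wells."""
    return 0.7  # dummy value so the example runs

def decide_grip(signal: float) -> str:
    """Release if the target chemical (IPTG) was detected, otherwise grip."""
    return "release" if signal > FLUORESCENCE_THRESHOLD else "grip"

if __name__ == "__main__":
    print(decide_grip(read_photodetector()))  # -> "release"
```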

[ UC Davis ]

The Toyota Research Institute (TRI) is taking on the hard problems in manipulation research toward making human-assist robots reliable and robust. Dr. Russ Tedrake, TRI Vice President of Robotics Research, explains how we are exploring the challenges and addressing the reliability gap by using a robot loading dishes in a dishwasher as an example task.

[ TRI ]

The Tactile Telerobot is the world’s first haptic telerobotic system that transmits realistic touch feedback to an operator located anywhere in the world. It is a collaboration between the Shadow Robot Company, HaptX, and SynTouch. All Nippon Airways funded the project’s initial research and development.

What’s really unique about this is the HaptX tactile feedback system, which is something we’ve been following for several years now. It’s one of the most magical tech experiences I’ve ever had, and you can read about it here and here.

[ HaptX ]

Thanks Andrew!

I love how snake robots can emulate some of the fanciest moves of real snakes, and then also do bonkers things that real snakes never do.

[ Matsuno Lab ]

Here are a couple of interesting videos from the Human-Robot Interaction Lab at Tufts.

A robot is instructed to perform an action and cannot do it due to lack of sensors. But when another robot is placed nearby, it can execute the instruction by tacitly tapping into the other robot’s mind and using that robot’s sensors for its own actions. Yes, it’s automatic, and yes, it’s the BORG!

Two Nao robots are instructed to perform a dance and are able to do it right after instruction. Moreover, they can switch roles immediately, and even a third, different robot (a PR2) can perform the dance right away, demonstrating the ability of our DIARC architecture to learn quickly and share knowledge with any type of robot running the architecture.

Compared to Nao, PR2 just sounds… depressed.

[ HRI Lab ]

This work explores the problem of robot tool construction – creating tools from parts available in the environment. We advance the state-of-the-art in robotic tool construction by introducing an approach that enables the robot to construct a wider range of tools with greater computational efficiency. Specifically, given an action that the robot wishes to accomplish and a set of building parts available to the robot, our approach reasons about the shape of the parts and potential ways of attaching them, generating a ranking of part combinations that the robot then uses to construct and test the target tool. We validate our approach on the construction of five tools using a physical 7-DOF robot arm.
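
To make the part-combination ranking concrete, here is a toy Python sketch of the idea: score every head/handle pairing by (assumed) shape suitability for the target action and by attachment plausibility, then return candidates best-first for physical testing. The heuristics are hypothetical placeholders, not the RAIL Lab method.

```python
from itertools import permutations

def shape_score(action: str, head: dict) -> float:
    """Assumed heuristic: how well this head part suits the action."""
    return head["flatness"] if action == "hammering" else head["concavity"]

def attachment_score(head: dict, handle: dict) -> float:
    """Assumed heuristic: 1.0 if the parts can plausibly be joined."""
    return 1.0 if abs(head["socket"] - handle["diameter"]) < 0.005 else 0.0

def rank_combinations(action: str, parts: list) -> list:
    """Return (score, head, handle) tuples, best candidate first."""
    scored = []
    for head, handle in permutations(parts, 2):
        s = shape_score(action, head) * attachment_score(head, handle)
        if s > 0:
            scored.append((s, head["name"], handle["name"]))
    return sorted(scored, reverse=True)
```

The robot would then construct and test the top-ranked combination, falling back to the next candidate on failure.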

[ RAIL Lab ] via [ RSS ]

We like Magazino’s approach to warehouse picking: constrain the problem to something you can reliably solve, like shoeboxes.

Magazino has announced a new pricing model for its robots. You pay 55,000 euros for the robot itself, and after that, all you pay to keep the robot working is 6 cents per pick, so the robot only costs you money for the work it actually does.
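
The arithmetic behind that model is easy to check; here is a tiny cost function sketching the pricing as described (not Magazino code):

```python
def total_cost_eur(picks: int, base: float = 55_000, per_pick: float = 0.06) -> float:
    """Up-front robot price plus 6 euro cents for every completed pick."""
    return base + per_pick * picks

print(total_cost_eur(1_000_000))  # one million picks -> 115000.0 euros
```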

[ Magazino ]

Thanks Florin!

Human-robot collaboration is happening in factories worldwide, yet few smaller businesses use it, due to high costs or the difficulty of customization. Elephant Robotics, a new player from Shenzhen, the Silicon Valley of Asia, has set its sights on helping smaller businesses gain access to smart robotics. It has created the Catbot, a collaborative robotic arm that will offer high efficiency and flexibility to various industries.

The Catbot is set to help with everything from education projects, photography, and massage to serving as a personal barista or playing tabletop games; the customization possibilities are endless. To increase its flexibility of use, the Catbot is designed to be extremely easy to program, from high-precision tasks up to large-scale jobs.

[ Elephant Robotics ]

Thanks Johnson!

Dronistics, an EPFL spin-off, has been testing out their enclosed delivery drone in the Dominican Republic through a partnership with WeRobotics.

[ WeRobotics ]

QTrobot is an expressive humanoid robot designed to help children with autism spectrum disorder and children with special educational needs learn new skills. QTrobot uses simple, exaggerated facial expressions combined with interactive games and stories to help children improve their emotional skills. It helps children learn about and better understand emotions, and teaches them strategies to handle those emotions more effectively.

[ LuxAI ]

Here’s a typical day in the life of a Tertill solar-powered autonomous weed-destroying robot.

$300, now shipping from Franklin Robotics.

[ Tertill ]

PAL Robotics is excited to announce a new TIAGo with two arms, TIAGo++! After carefully listening to the robotics community’s needs, we used TIAGo’s modularity to integrate two 7-DoF arms into our mobile manipulator. TIAGo++ can help you swiftly accomplish your research goals, opening endless possibilities in mobile manipulation.

[ PAL Robotics ]

Thanks Jack!

You’ve definitely already met the Cobalt security robot, but Toyota AI Ventures just threw a pile of money at them and would therefore like you to experience this re-introduction:

[ Cobalt Robotics ] via [ Toyota AI ]

ROSIE is a mobile manipulator kit from HEBI Robotics. And if you don’t like ROSIE, the modular nature of HEBI’s hardware means that you can take her apart and make something more interesting.

[ HEBI Robotics ]

Learn about Kawasaki Robotics’ second addition to their line of duAro dual-arm collaborative robots, duAro2. This model offers an extended vertical reach (550 mm) and an increased payload capacity (3 kg/arm).

[ Kawasaki Robotics ]

Drone Delivery Canada has partnered with Peel Region Paramedics to pilot its proprietary drone delivery platform as rapid first-responder technology, with the goal of reducing response times and potentially saving lives.

[ Drone Delivery Canada ]

In this week’s episode of Robots in Depth, Per speaks with Harri Ketamo, from Headai.

Harri Ketamo talks about AI and how he aims to mimic human decision making with algorithms. Harri has done a lot of AI work for computer games, creating opponents that are entertaining to play against. It is easy to develop a very bad or a very good opponent, but designing one that behaves like a human, is entertaining to play against, and can still be beaten is quite hard. He talks about how AI in computer games is a very important storytelling tool and an important part of making a game entertaining to play.

This work led him into other parts of the AI field. Harri thinks we sometimes have a problem separating what is real from the kind of storytelling he knows from gaming AI. He calls for critical analysis of AI, and says that data has to be used to verify AI decisions and results.

[ Robots in Depth ]

Thanks Per!


#435658 Video Friday: A Two-Armed Robot That ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2019 – July 29-30, 2019 – London, U.K.
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
Let us know if you have suggestions for next week, and enjoy today’s videos.

I’m sure you’ve seen this video already because you read this blog every day, but if you somehow missed it because you were skiing across Antarctica (the only valid excuse we’re accepting today), here’s our video introducing HMI’s Aquanaut transforming robot submarine.

And after you recover from all that frostbite, make sure to read our in-depth feature article here.

[ Aquanaut ]

Last week we complained about not having seen a ballbot with a manipulator, so Roberto from CMU shared a new video of their ballbot, featuring a pair of 7-DoF arms.

We should learn more at Humanoids 2019.

[ CMU ]

Thanks Roberto!

The FAA is making it easier for recreational drone pilots to get near-real-time approval to fly in lightly controlled airspace.

[ LAANC ]

Self-reconfigurable modular robots are usually composed of multiple modules with uniform docking interfaces that can transform themselves into different configurations. The reconfiguration planning problem is finding what sequence of reconfiguration actions is required for one arrangement of modules to transform into another. We present a novel reconfiguration planning algorithm for modular robots. The algorithm compares the initial configuration with the goal configuration efficiently. The reconfiguration actions can be executed in a distributed manner, so that each module can efficiently finish its own reconfiguration task, resulting in a global reconfiguration of the system. Finally, the algorithm is demonstrated on real modular robots, and some example reconfiguration tasks are provided.
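
The core comparison step the abstract describes can be illustrated with a simple diff between configurations. The lattice-set representation below is a generic sketch for intuition, not the paper’s actual algorithm:

```python
def reconfiguration_moves(initial: set, goal: set) -> list:
    """Pair modules that must vacate cells with cells that must be filled."""
    to_vacate = initial - goal   # occupied now, empty in the goal
    to_fill = goal - initial     # empty now, occupied in the goal
    # Each (from, to) pair can then be planned and executed by the module
    # itself, giving the distributed behavior described above.
    return list(zip(sorted(to_vacate), sorted(to_fill)))

# A three-module vertical line reshaping into a horizontal line:
print(reconfiguration_moves({(0, 0), (0, 1), (0, 2)},
                            {(0, 0), (1, 0), (2, 0)}))
# -> [((0, 1), (1, 0)), ((0, 2), (2, 0))]
```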

[ CKbot ]

A nice design of a gripper that uses a passive thumb of sorts to pick up flat objects from flat surfaces.

[ Paper ] via [ Laval University ]

I like this video of a palletizing robot from Kawasaki because in the background you can see a human doing the exact same job and obviously not enjoying it.

[ Kawasaki ]

This robot cleans and “brings joy and laughter.” What else do we need?

I do appreciate that all the robots are named Leo, and that they’re also all female.

[ LionsBot ]

This is less of a dishwashing robot and more of a dish-sorting robot, but we’ll forgive it because it doesn’t drop a single dish.

[ TechMagic ]

Thanks Ryosuke!

A slight warning here that the robot in the following video (which costs something like $180,000) appears “naked” in some scenes, none of which are strictly objectionable, we hope.

Beautifully slim and delicate life-size motion figures are ideal avatars for expressing emotions to customers in various arts, content, and businesses. We can provide a system that integrates not only motion figures but all moving devices.

[ Speecys ]

The best way to operate a Husky with a pair of manipulators on it is to become the robot.

[ UT Austin ]

The FlyJacket drone control system from EPFL has been upgraded so that it can yank you around a little bit.

In several fields of human-machine interaction, haptic guidance has proven to be an effective training tool for enhancing user performance. This work presents the results of psychophysical and motor learning studies that were carried out with human participants to assess the effect of cable-driven haptic guidance for a task involving aerial robotic teleoperation. The guidance system was integrated into an exosuit, called the FlyJacket, that was developed to control drones with torso movements. Results for the Just Noticeable Difference (JND) and from the Stevens Power Law suggest that the perception of force on the users’ torso scales linearly with the amplitude of the force exerted through the cables, and the perceived force is close to the magnitude of the stimulus. Motor learning studies reveal that this form of haptic guidance improves user performance in training, but the improvement is not retained when participants are evaluated without guidance.
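
For readers unfamiliar with the Stevens Power Law cited here, it relates perceived intensity ψ to physical stimulus magnitude φ:

```latex
\[ \psi = k\,\phi^{a} \]
% The finding that perceived force scales linearly with cable tension
% corresponds to an exponent a close to 1, and "perceived force close to
% the magnitude of the stimulus" further implies k is near 1.
```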

[ EPFL ]

The SAND Challenge is an opportunity for small businesses to compete in an autonomous unmanned aerial vehicle (UAV) competition to help NASA address safety-critical risks associated with flying UAVs in the national airspace. Set in a post-natural disaster scenario, SAND will push the envelope of aviation.

[ NASA ]

Legged robots have the potential to traverse diverse and rugged terrain. To find a safe and efficient navigation path and to carefully select individual footholds, it is useful to predict properties of the terrain ahead of the robot. In this work, we propose a method to collect data from robot-terrain interaction and associate it to images, to then train a neural network to predict terrain properties from images.
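
In outline, this is a standard supervised-regression setup: images in, measured terrain properties out. Here is a minimal PyTorch sketch of such a pipeline; the architecture, property list, and training details are assumptions for illustration, not the RSL model.

```python
import torch
import torch.nn as nn

class TerrainNet(nn.Module):
    """Tiny CNN regressor: RGB patch -> terrain properties (assumed: friction, compliance)."""
    def __init__(self, n_properties: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, n_properties)

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(img))

model = TerrainNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One illustrative training step on dummy data: in the real pipeline, the
# labels come from the robot's own terrain interactions, associated with
# camera images of the same spot.
images = torch.rand(8, 3, 64, 64)
labels = torch.rand(8, 2)
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```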

[ RSL ]

Misty wants to be your new receptionist.

[ Misty Robotics ]

For years, we’ve been pointing out that while new Roombas have lots of great features, older Roombas still do a totally decent job of cleaning your floors. This video is a performance comparison between the newest Roomba (the S9+) and the original 2002 Roomba (!), and the results will surprise you. Or maybe they won’t.

[ Vacuum Wars ]

Lex Fridman from MIT interviews Chris Urmson, who was involved in some of the earliest autonomous vehicle projects, Google’s original self-driving car among them, and is currently CEO of Aurora Innovation.

Chris Urmson was the CTO of the Google Self-Driving Car team, a key engineer and leader behind the Carnegie Mellon autonomous vehicle entries in the DARPA Grand Challenges, and the winner of the DARPA Urban Challenge. Today he is the CEO of Aurora Innovation, an autonomous vehicle software company he started with Sterling Anderson, the former director of Tesla Autopilot, and Drew Bagnell, Uber’s former autonomy and perception lead.

[ AI Podcast ]

In this week’s episode of Robots in Depth, Per speaks with Lael Odhner from RightHand Robotics.

Lael Odhner is a co-founder of RightHand Robotics, which is developing a gripper that combines control with soft, compliant parts to grasp objects better. Their work focuses on grasping and manipulating everyday human objects in everyday environments. This mimics how human hands combine control and flexibility to grasp objects with great dexterity.

The combination of control and compliance makes the RightHand Robotics gripper very lightweight and affordable. The compliance makes it easier to grasp objects of unknown shape, and differs from the way industrial robots usually grip. Compliance also helps in unstructured environments, where contact with an object and its surroundings cannot be exactly predicted.

[ RightHand Robotics ] via [ Robots in Depth ]


#435614 3 Easy Ways to Evaluate AI Claims

When every other tech startup claims to use artificial intelligence, it can be tough to figure out if an AI service or product works as advertised. In the midst of the AI “gold rush,” how can you separate the nuggets from the fool’s gold?

There’s no shortage of cautionary tales involving overhyped AI claims. And applying AI technologies to health care, education, and law enforcement means that getting it wrong can have real consequences for society—not just for investors who bet on the wrong unicorn.

So IEEE Spectrum asked experts to share their tips for how to identify AI hype in press releases, news articles, research papers, and IPO filings.

“It can be tricky, because I think the people who are out there selling the AI hype—selling this AI snake oil—are getting more sophisticated over time,” says Tim Hwang, director of the Harvard-MIT Ethics and Governance of AI Initiative.

The term “AI” is perhaps most frequently used to describe machine learning algorithms (and deep learning algorithms, which require even less human guidance) that analyze huge amounts of data and make predictions based on patterns that humans might miss. These popular forms of AI are mostly suited to specialized tasks, such as automatically recognizing certain objects within photos. For that reason, they are sometimes described as “weak” or “narrow” AI.

Some researchers and thought leaders like to talk about the idea of “artificial general intelligence” or “strong AI” that has human-level capacity and flexibility to handle many diverse intellectual tasks. But for now, this type of AI remains firmly in the realm of science fiction and is far from being realized in the real world.

“AI has no well-defined meaning and many so-called AI companies are simply trying to take advantage of the buzz around that term,” says Arvind Narayanan, a computer scientist at Princeton University. “Companies have even been caught claiming to use AI when, in fact, the task is done by human workers.”

Here are three ways to recognize AI hype.

Look for Buzzwords
One red flag is what Hwang calls the “hype salad.” This means stringing together the term “AI” with many other tech buzzwords such as “blockchain” or “Internet of Things.” That doesn’t automatically disqualify the technology, but spotting a high volume of buzzwords in a post, pitch, or presentation should raise questions about what exactly the company or individual has developed.

Other experts agree that strings of buzzwords can be a red flag. That’s especially true if the buzzwords are never really explained in technical detail, and are simply tossed around as vague, poorly-defined terms, says Marzyeh Ghassemi, a computer scientist and biomedical engineer at the University of Toronto in Canada.

“I think that if it looks like a Google search—picture ‘interpretable blockchain AI deep learning medicine’—it's probably not high-quality work,” Ghassemi says.

Hwang also suggests mentally replacing all mentions of “AI” in an article with the term “magical fairy dust.” It’s a way of seeing whether an individual or organization is treating the technology like magic. If so—that’s another good reason to ask more questions about what exactly the AI technology involves.

And even the visual imagery used to illustrate AI claims can indicate that an individual or organization is overselling the technology.

“I think that a lot of the people who work on machine learning on a day-to-day basis are pretty humble about the technology, because they’re largely confronted with how frequently it just breaks and doesn't work,” Hwang says. “And so I think that if you see a company or someone representing AI as a Terminator head, or a big glowing HAL eye or something like that, I think it’s also worth asking some questions.”

Interrogate the Data

It can be hard to evaluate AI claims without any relevant expertise, says Ghassemi at the University of Toronto. Even experts need to know the technical details of the AI algorithm in question and have some access to the training data that shaped the AI model’s predictions. Still, savvy readers with some basic knowledge of applied statistics can search for red flags.

To start, readers can look for possible bias in training data based on small sample sizes or a skewed population that fails to reflect the broader population, Ghassemi says. After all, an AI model trained only on health data from white men would not necessarily achieve similar results for other populations of patients.

“For me, a red flag is not demonstrating deep knowledge of how your labels are defined.”
—Marzyeh Ghassemi, University of Toronto

How machine learning and deep learning models perform also depends on how well humans labeled the sample datasets used to train these programs. This task can be straightforward when labeling photos of cats versus dogs, but gets more complicated when assigning disease diagnoses to certain patient cases.

Medical experts frequently disagree with each other on diagnoses—which is why many patients seek a second opinion. Not surprisingly, this ambiguity can also affect the diagnostic labels that experts assign in training datasets. “For me, a red flag is not demonstrating deep knowledge of how your labels are defined,” Ghassemi says.

Such training data can also reflect the cultural stereotypes and biases of the humans who labeled the data, says Narayanan at Princeton University. Like Ghassemi, he recommends taking a hard look at exactly what the AI has learned: “A good way to start critically evaluating AI claims is by asking questions about the training data.”

Another red flag is presenting an AI system’s performance through a single accuracy figure without much explanation, Narayanan says. Claiming that an AI model achieves “99 percent” accuracy doesn’t mean much without knowing the baseline for comparison—such as whether other systems have already achieved 99 percent accuracy—or how well that accuracy holds up in situations beyond the training dataset.

Narayanan also emphasized the need to ask questions about an AI model’s false positive rate—the rate of making wrong predictions about the presence of a given condition. Even if the false positive rate of a hypothetical AI service is just one percent, that could have major consequences if that service ends up screening millions of people for cancer.
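
It is worth working that example through. With an assumed disease prevalence and sensitivity, even a one percent false positive rate swamps the true positives:

```python
# Assumed numbers for illustration: 0.5% prevalence, 99% sensitivity.
screened = 1_000_000
prevalence = 0.005
sensitivity = 0.99
false_positive_rate = 0.01

sick = screened * prevalence                                # 5,000 people
true_positives = sensitivity * sick                         # 4,950
false_positives = false_positive_rate * (screened - sick)   # 9,950

precision = true_positives / (true_positives + false_positives)
print(f"{false_positives:,.0f} false alarms; precision = {precision:.0%}")
# -> 9,950 false alarms; precision = 33%
```

In other words, two out of three people flagged by this hypothetical screen would not have the condition at all.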

Readers can also consider whether using AI in a given situation offers any meaningful improvement compared to traditional statistical methods, says Clayton Aldern, a data scientist and journalist who serves as managing director for Caldern LLC. He gave the hypothetical example of a “super-duper-fancy deep learning model” that achieves a prediction accuracy of 89 percent, compared to a “little polynomial regression model” that achieves 86 percent on the same dataset.

“We're talking about a three-percentage-point increase on something that you learned about in Algebra 1,” Aldern says. “So is it worth the hype?”

Don’t Ignore the Drawbacks

The hype surrounding AI isn’t just about the technical merits of services and products driven by machine learning. Overblown claims about the beneficial impacts of AI technology—or vague promises to address ethical issues related to deploying it—should also raise red flags.

“If a company promises to use its tech ethically, it is important to question if its business model aligns with that promise,” Narayanan says. “Even if employees have noble intentions, it is unrealistic to expect the company as a whole to resist financial imperatives.”

One example might be a company with a business model that depends on leveraging customers’ personal data. Such companies “tend to make empty promises when it comes to privacy,” Narayanan says. And, if companies hire workers to produce training data, it’s also worth asking whether the companies treat those workers ethically.

The transparency—or lack thereof—about any AI claim can also be telling. A company or research group can minimize concerns by publishing technical claims in peer-reviewed journals or allowing credible third parties to evaluate their AI without giving away big intellectual property secrets, Narayanan says. Excessive secrecy is a big red flag.

With these strategies, you don’t need to be a computer engineer or data scientist to start thinking critically about AI claims. And, Narayanan says, the world needs many people from different backgrounds for societies to fully consider the real-world implications of AI.

Editor’s Note: The original version of this story misspelled Clayton Aldern’s last name as Alderton.


#435167 A Closer Look at the Robots Helping Us ...

Buck Rogers had Twiki. Luke Skywalker palled around with C-3PO and R2-D2. And astronauts aboard the International Space Station (ISS) now have their own robotic companions in space—Astrobee.

A pair of the cube-shaped robots were launched to the ISS during an April re-supply mission and are currently being commissioned for use on the space station. The free-flying space robots, dubbed Bumble and Honey, are the latest generation of robotic machines to join the human crew on the ISS.

Exploration of the solar system and beyond will require autonomous machines that can assist humans with numerous tasks—or go where we cannot. NASA has said repeatedly that robots will be instrumental in future space missions to the moon, Mars, and even to the icy moon Europa.

The Astrobee robots will specifically test robotic capabilities in zero gravity, replacing the SPHERES (Synchronized Position Hold, Engage, Reorient, Experimental Satellite) robots that have been on the ISS for more than a decade to test various technologies ranging from communications to navigation.

The 18-sided SPHERES robots, each about the size of a volleyball or an oversized Dungeons and Dragons die, use CO2-based cold-gas thrusters for movement and a series of ultrasonic beacons for orientation. The Astrobee robots, on the other hand, can propel themselves autonomously around the interior of the ISS using electric fans and six cameras.

The modular design of the Astrobee robots means they are highly plug-and-play, capable of being reconfigured with different hardware modules. The robots’ software is also open-source, encouraging scientists and programmers to develop and test new algorithms and features.

And, yes, the Astrobee robots will be busy as bees once they are fully commissioned this fall, with experiments planned to begin next year. Scientists hope to learn more about how robots can assist space crews and perform caretaking duties on spacecraft.

Robots Working Together
The Astrobee robots are expected to be joined by a familiar “face” on the ISS later this year—the humanoid robot Robonaut.

Robonaut, also known as R2, was the first US-built robot on the ISS. It joined the crew back in 2011 without legs, which were added in 2014. However, the installation never entirely worked, as R2 experienced power failures that eventually led to its return to Earth last year to fix the problem. If all goes as planned, the space station’s first humanoid robot will return to the ISS to lend a hand to the astronauts and the new robotic arrivals.

In particular, NASA is interested in how the two different robotic platforms can complement each other, with an eye toward outfitting the agency’s proposed lunar orbital space station with various robots that can supplement a human crew.

“We don’t have definite plans for what would happen on the Gateway yet, but there’s a general recognition that intra-vehicular robots are important for space stations,” Trey Smith, Astrobee technical lead in the NASA Intelligent Robotics Group, told IEEE Spectrum. “And so, it would not be surprising to see a mobile manipulator like Robonaut, and a free flyer like Astrobee, on the Gateway.”

While the focus on R2 has been to test its capabilities in zero gravity and to use it for mundane or dangerous tasks in space, the technology enabling the humanoid robot has proven to be equally useful on Earth.

For example, R2 has amazing dexterity for a robot, with sensors, actuators, and tendons comparable to the nerves, muscles, and tendons in a human hand. Based on that design, engineers are working on a robotic glove that can help factory workers, for instance, do their jobs better while reducing the risk of repetitive injuries. R2 has also inspired development of a robotic exoskeleton for both astronauts in space and paraplegics on Earth.

Working Hard on Soft Robotics
While innovative and technologically sophisticated, Astrobee and Robonaut are typical robots in that neither one would do well in a limbo contest. In other words, most robots are limited in their flexibility and agility based on current hardware and materials.

A subfield of robotics known as soft robotics involves developing robots with highly pliant materials that mimic biological organisms in how they move. Scientists at NASA’s Langley Research Center are investigating how soft robots could help with future space exploration.

Specifically, the researchers are looking at a series of properties to understand how actuators—components responsible for moving a robotic part, such as Robonaut’s hand—can be built and used in space.

The team first 3D prints a mold and then pours a flexible material like silicone into it. Air bladders or chambers in the resulting actuator then expand and contract using only air.

Some of the first applications of soft robotics sound more tool-like than R2-D2-like. For example, two soft robots could connect to produce a temporary shelter for astronauts on the moon or serve as an impromptu wind shield during one of Mars’ infamous dust storms.

The idea is to use soft robots in situations that are “dangerous, dirty, or dull,” according to Jack Fitzpatrick, a NASA intern working on the soft robotics project at Langley.

Working on Mars
Of course, space robots aren’t only designed to assist humans. In many instances, they are the only option to explore even relatively close celestial bodies like Mars. Four American-made robotic rovers have been used to investigate the fourth planet from the sun since 1997.

Opportunity is perhaps the most famous, covering about 25 miles of terrain across Mars over 15 years. A dust storm knocked it out of commission last year, with NASA officially ending the mission in February.

However, the biggest and baddest of the Mars rovers, Curiosity, is still crawling across the Martian surface, sending back valuable data since 2012. The car-size robot carries 17 cameras, a laser to vaporize rocks for study, and a drill to collect samples. It is on the hunt for signs of biological life.

The next year or two could see a virtual traffic jam of robots to Mars. NASA’s Mars 2020 Rover is next in line to visit the Red Planet, sporting scientific gadgets like an X-ray fluorescence spectrometer for chemical analyses and ground-penetrating radar to see below the Martian surface.

This diagram shows the instrument payload for the Mars 2020 mission. Image Credit: NASA.
Meanwhile, the Europeans have teamed with the Russians on a rover called Rosalind Franklin, named after a famed British chemist, that will drill down into the Martian ground for evidence of past or present life as soon as 2021.

The Chinese are also preparing to begin searching for life on Mars using robots as soon as next year, as part of the country’s Mars Global Remote Sensing Orbiter and Small Rover program. The mission is scheduled to be the first in a series of launches that would culminate with bringing samples back from Mars to Earth.

Perhaps no utterance in the universe of science fiction is more famous than “to boldly go where no one has gone before.” The fact is, however, that human exploration of the solar system and beyond will only be possible with robots of different sizes, shapes, and sophistication.

Image Credit: NASA.


#435152 The Futuristic Tech Disrupting Real ...

In the wake of the housing market collapse of 2008, one entrepreneur decided to dive right into the failing real estate industry. But this time, he didn’t buy any real estate to begin with. Instead, Glenn Sanford decided to launch the first-ever cloud-based real estate brokerage, eXp Realty.

Contracting virtual platform VirBELA to build out the company’s mega-campus in VR, eXp Realty demonstrates the power of a dematerialized workspace, throwing out hefty overhead costs and fundamentally redefining what ‘real estate’ really means. Ten years later, eXp Realty has an army of 14,000 agents across all 50 US states, 3 Canadian provinces, and 400 MLS market areas… all without a single physical office.

But VR is just one of many exponential technologies converging to revolutionize real estate and construction. As floating cities and driverless cars spread out your living options, AI and VR are together cutting out the middleman.

Already, the global construction industry is projected to surpass $12.9 trillion in 2022, and the total value of the US housing market alone grew to $33.3 trillion last year. Both vital for our daily lives, these industries will continue to explode in value, posing countless possibilities for disruption.

In this blog, I’ll be discussing the following trends:

New prime real estate locations;
Disintermediation of the real estate broker and search;
Materials science and 3D printing in construction.

Let’s dive in!

Location Location Location
Until today, location has been the name of the game when it comes to hunting down the best real estate. But constraints on land often drive up costs while limiting options, and urbanization is only exacerbating the problem.

Beyond the world of virtual real estate, two primary mechanisms are driving the creation of new locations.

(1) Floating Cities

Offshore habitation hubs, floating cities have long been conceived as a solution to rising sea levels, skyrocketing urban populations, and threatened ecosystems. If successful, they will soon unlock an abundance of prime real estate, whether for scenic living, commerce, education, or recreation.

One pioneering model is that of Oceanix City, designed by Danish architect Bjarke Ingels and a host of other domain experts. Intended to adapt organically over time, Oceanix would consist of a galaxy of mass-produced, hexagonal floating modules, built as satellite “cities” off coastal urban centers and sustained by renewable energies.

While individual 4.5-acre platforms would each sustain 300 people, these hexagonal modules are designed to link into 75-acre tessellations sustaining up to 10,000 residents. Each anchored to the ocean floor using biorock, Oceanix cities are slated to be closed-loop systems, as external resources are continuously supplied by automated drone networks.

Electric boats or flying cars might zoom you to work, city-embedded water capture technologies would provide your water, and while vertical and outdoor farming supply your family meal, share economies would dominate goods provision.

AERIAL: Located in calm, sheltered waters, near coastal megacities, OCEANIX City will be an adaptable, sustainable, scalable, and affordable solution for human life on the ocean. Image Credit: OCEANIX/BIG-Bjarke Ingels Group.
Joined by countless government officials whose islands risk submersion at the hands of sea level rise, the UN is now getting on board. And just this year, seasteading is exiting the realm of science fiction and testing practical waters.

As French Polynesia seeks out robust solutions to sea level rise, its government has joined forces with the San Francisco-based Seasteading Institute. With a newly designated special economic zone and 100 acres of beachfront, this joint Floating Island Project could even see up to a dozen inhabitable structures by 2020. And what better way to fund the $60 million project than the team’s upcoming ICO?

But aside from creating new locations, autonomous vehicles (AVs) and flying cars are turning previously low-demand land into the prime real estate of tomorrow.

(2) Autonomous Electric Vehicles and Flying Cars

Today, the value of a location is a function of its proximity to your workplace, your city’s central business district, the best schools, or your closest friends.

But what happens when driverless cars desensitize you to distance, or Hyperloop and flying cars decimate your commute time? Historically, every time new transit methods have hit the mainstream, tolerance for distance has opened up right alongside them, further catalyzing city spread.

And just as Hyperloop and the Boring Company aim to make your commute immaterial, autonomous vehicle (AV) ridesharing services will spread out cities in two ways: (1) by drastically reducing parking spaces needed (vertical parking decks = more prime real estate); and (2) by untethering you from the steering wheel. Want an extra two hours of sleep on the way to work? Schedule a sleeper AV and nap on your route to the office. Need a car-turned-mobile-office? No problem.

Meanwhile, aerial taxis (i.e. flying cars) will allow you to escape ground congestion entirely, delivering you from bedroom to boardroom at decimated time scales.

Already working with regulators, Uber Elevate has staked out ambitious plans for its UberAIR airborne taxi project. By 2023, Uber anticipates rolling out flying drones in its first two pilot cities, Los Angeles and Dallas. Flying between rooftop skyports, the drones would carry passengers at heights of 1,000 to 2,000 feet and speeds of 100 to 200 mph. And while costs per ride are anticipated to resemble those of an Uber Black based on mileage, prices are projected to soon drop to those of an UberX.

But the true economic feat boils down to this: if I were to commute 50 to 100 kilometers, I could get two or three times the house for the same price. (Not to mention the extra living space offered up by my now-unneeded garage.)

All of a sudden, virtual reality, broadband, AVs, or high-speed vehicles are going to change where we live and where we work. So rather than living in a crowded, dense urban core for access to jobs and entertainment, our future of personalized, autonomous, low-cost transport opens the luxury of rural areas to all without compromising the benefits of a short commute.

Once these drivers multiply your real estate options, how will you select your next home?

Disintermediation: Say Bye to Your Broker
In a future of continuous and personalized preference-tracking, why hire a human agent who knows less about your needs and desires than a personal AI?

Just as disintermediation is cutting out bankers and insurance agents, so too is it closing in on real estate brokers. Over the next decade, as AI becomes your agent, VR will serve as your medium.

To paint a more vivid picture of how this will look: over 98 percent of your home search will be conducted from the comfort of your couch through next-generation VR headgear.

Once you’ve verbalized your primary desires for home location, finishes, size, etc. to your personal AI, it will offer you top picks, tourable 24/7, with optional assistance from a virtual guide and constantly updated data. As a seller, this means potential buyers from two miles, or two continents, away.

Throughout each immersive VR tour, advanced eye-tracking software and a permissioned machine learning algorithm follow your gaze, further learn your likes and dislikes, and intelligently recommend other homes or commercial residences to visit.

Curious as to what the living room might look like with a fresh coat of blue paint and a white carpet? No problem! VR programs will be able to modify rendered environments instantly, changing countless variables, from furniture materials to even the sun’s orientation. Keen to input your own furniture into a VR-rendered home? Advanced AIs could one day compile all your existing furniture, electronics, clothing, decorations, and even books, virtually organizing them across any accommodating new space.
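
A toy sketch of that gaze-driven preference loop might look like the following; the feature names, dwell times, and scoring rule are all hypothetical:

```python
from collections import defaultdict

# Dwell time (seconds) per feature, logged by eye tracking during a tour.
gaze_log = [("bay_window", 7.9), ("marble_counter", 4.2), ("carpet", 0.4)]

weights: dict = defaultdict(float)
for feature, dwell_seconds in gaze_log:
    weights[feature] += dwell_seconds
total = sum(weights.values())
preferences = {f: w / total for f, w in weights.items()}

def score_listing(features: set) -> float:
    """Rank a candidate home by its overlap with the learned preferences."""
    return sum(preferences.get(f, 0.0) for f in features)

print(score_listing({"bay_window", "carpet"}))  # ~0.66
```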

As 3D scanning technologies make extraordinary headway, VR renditions will only grow cheaper and higher resolution. One company called Immersive Media (disclosure: I’m an investor and advisor) has a platform for 360-degree video capture and distribution, and is already exploring real estate 360-degree video.

Smaller firms like Studio 216, Vieweet, Arch Virtual, ArX Solutions, and Rubicon Media can similarly capture and render models of various properties for clients and investors to view and explore. In essence, VR real estate platforms will allow you to explore any home for sale, do the remodel, and determine if it truly is the house of your dreams.

Once you’re ready to make a bid, your AI will even help estimate the bid, then process and submit your offer. Real estate companies like Zillow, Trulia, Move, Redfin, ZipRealty (acquired by Realogy in 2014), and many others have already invested millions in machine learning applications to make search, valuation, consulting, and property management easier, faster, and much more accurate.

But what happens if the home you desire most means starting from scratch with new construction?

New Methods and Materials for Construction
For thousands of years, we’ve been constrained by the construction materials of nature. We built bricks from naturally abundant clay and shale, used tree limbs as our rooftops and beams, and mastered incredible structures in ancient Rome with the use of cement.

But construction is now on the cusp of a materials science revolution. Today, I’d like to focus on three key materials:

Upcycled Materials

Imagine if you could turn the world’s greatest waste products into their most essential building blocks. Thanks to UCLA researchers at CO2NCRETE, we can already do this with carbon emissions.

Today, concrete produces about five percent of all greenhouse gas (GHG) emissions. But what if concrete could instead conserve greenhouse emissions? CO2NCRETE engineers capture carbon from smokestacks and combine it with lime to create a new type of cement. The lab’s 3D printers then shape the upcycled concrete to build entirely new structures. Once conquered at scale, upcycled concrete will turn a former polluter into a future conserver.

Or what if we wanted to print new residences from local soil at hand? Marking an extraordinary convergence between robotics and 3D printing, the Institute of Advanced Architecture of Catalonia (IAAC) is already working on a solution.

In a major feat for low-cost construction in remote zones, IAAC has found a way to convert almost any soil into a building material with three times the tensile strength of industrial clay. Offering myriad benefits, including natural insulation, low GHG emissions, fire protection, air circulation, and thermal mediation, IAAC’s new 3D printed native soil can build houses on-site for as little as $1,000.

Nanomaterials

Nano- and micro-materials are ushering in a new era of smart, super-strong, and self-charging buildings. While carbon nanotubes dramatically increase the strength-to-weight ratio of skyscrapers, revolutionizing their structural flexibility, nanomaterials don’t stop here.

Several research teams are pioneering silicon nanoparticles to capture everyday light flowing through our windows. Little solar cells at the edges of windows then harvest this energy for ready use. Researchers at the US National Renewable Energy Lab have developed similar smart windows. Turning into solar panels when bathed in sunlight, these thermochromic windows will power our buildings, changing color as they do.

Self-Healing Infrastructure

The American Society of Civil Engineers estimates that the US needs to spend roughly $4.5 trillion to fix nationwide roads, bridges, dams, and common infrastructure by 2025. But what if infrastructure could fix itself?

Enter self-healing concrete. Engineers at Delft University have developed bio-concrete that can repair its own cracks. As head researcher Henk Jonkers explains, “What makes this limestone-producing bacteria so special is that they are able to survive in concrete for more than 200 years and come into play when the concrete is damaged. […] If cracks appear as a result of pressure on the concrete, the concrete will heal these cracks itself.”

But bio-concrete is only the beginning of self-healing technologies. As futurist architecture firms start printing plastic and carbon-fiber houses like the stunner seen below (using Branch Technologies’ 3D printing technology), engineers have begun tackling self-healing plastic.

And in a bid to go smart, burgeoning construction projects have started embedding sensors for preemptive detection. Beyond materials and sensors, however, construction methods are fast colliding into robotics and 3D printing.

While some startups and research institutes have leveraged robot swarm construction (namely, Harvard’s robotic termite-like swarm of programmed constructors), others have taken to large-scale autonomous robots.

One such example involves Fastbrick Robotics. After multiple iterations, the company’s Hadrian X end-to-end bricklaying robot can now autonomously build a fully livable, 180-square-meter home in under three days. Using a laser-guided robotic attachment, the all-in-one brick-loaded truck simply drives to a construction site and directs blocks through its robotic arm in accordance with a 3D model.

The Hadrian X’s layhead. Image Credit: Fastbrick Robotics.
Meeting verified building standards, Hadrian and similar solutions hold massive promise in the long term, deployable across post-conflict refugee sites and regions recovering from natural catastrophes.

Imagine the implications. Eliminating human safety concerns and unlocking any environment, autonomous builder robots could collaboratively build massive structures in space or deep underwater habitats.

Final Thoughts
Where, how, and what we live in form a vital pillar of our everyday lives. The concept of “home” is unlikely to disappear anytime soon. At the same time, real estate and construction are two of the biggest playgrounds for technological convergence, each on the verge of revolutionary disruption.

As underlying shifts in transportation, land reclamation, and the definition of “space” (real vs. virtual) take hold, the real estate market is about to explode in value, spreading out urban centers on unprecedented scales and unlocking vast new prime “property.”

Meanwhile, converging advancements in AI and VR are fundamentally disrupting the way we design, build, and explore new residences. Just as mirror worlds create immersive, virtual real estate economies, VR tours and AI agents are absorbing both sides of the coin to entirely obliterate the middleman.

And as materials science breakthroughs meet new modes of construction, the only limits to tomorrow’s structures are those of our own imagination.

Join Me
Abundance-Digital Online Community: Stay ahead of technological advancements and turn your passion into action. Abundance Digital is now part of Singularity University. Learn more.

Image Credit: OCEANIX/BIG-Bjarke Ingels Group.
