Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#439576 Video Friday: Robot Opera

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RO-MAN 2021 – August 8-12, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.
The New National Theatre Tokyo presents Super Angel: “Witness the birth of a new opera, performed by Alter 3, an android with artificial life who makes friends with children in the chorus as they sing and perform together.”

Alter 3 is characterized by a body in which all the interior mechanisms are exposed, and a face from which it is impossible to determine gender or age; it is an android designed to feel life, something unprecedented in the field. Researchers from Osaka University and the University of Tokyo, both famous for their work on androids and artificial life, have been collaborating to create and study two Alter androids. The main challenges are whether robots can acquire a sense of life independently through interaction with the outside world, and, through that work, answering the basic question of exactly what life is. [ NNTT ] via [ Robotstart ]
Running the bases at Dodger Stadium is a fun tradition that many children look forward to after most Sunday games. But not all children, especially those who are currently hospitalized or recovering from an illness at home, can physically experience it. That's why UCLA Health, the Dodgers and OhmniLabs teamed up to create a virtual run-the-bases experience for 10 pediatric patients at UCLA Mattel Children's Hospital.
[ UCLA ]
Thanks, Joseph!
The way to teach robots to move like animals is to collect data from animals, and it's surprising how much of a difference some little tweaks can make to a quadrupedal gait.

[ ETHZ CRL ]
Thanks, Fan!
Walker X had me at back massage.

[ Ubtech ]
I needed this video today.

[ Soft Robotics ]
MIT faculty and staff reimagine an iconic mechanical engineering class – 2.007 (Design and Manufacturing I) – so students can go head-to-head in the final robot competition from their dorm rooms, apartments, or homes across the country.
The full competition livestreams are at the link below.
[ MIT 2.007 ]
The world's best female flair bartender vs the most advanced bartending robot. Who's gonna win?
Kuka's last human versus robot challenge involving table tennis was a huge disappointment, so I really hope this one is better.
[ Makr Shakr ]
I know software compliance is all the rage, but there's still something to be said for robot arms that are inherently soft.

[ Motion Intelligent Lab ]
Thanks, Fan!
We present a versatile, adhesive, and soft material (called VENOM) with high dynamic friction and normal adhesion forces on various smooth and rough surfaces. VENOM is a dry adhesive material based on a simple mixture of super-soft, fast-cure platinum-catalyzed silicone and iron powder. Our results demonstrate the use of VENOM for the feet of our sprawling-posture robot.
[ Paper ]
Thanks, Poramate!
Hybrid security by humans and robots: security guards communicate with people through the avatar security robot Ugo, and the hybrid approach takes advantage of the respective strengths of the robots and the guards.
What's the head on the stick at the end? I want one of those!
[ Ugo ]
Check out more views of the MQ-25 T1 test asset's historic flight, when it became the first unmanned aircraft ever to refuel another aircraft (piloted or autonomous) in flight. During a June 2021 flight test, the MQ-25 T1 test asset transferred fuel to an F/A-18 Super Hornet.
[ Boeing ]
It's definitely cool to be able to do this with a robot, but it really makes you realize how effortless these tasks are for humans, right?

[ Extend Robotics ]
GE Research's Robotics and Autonomy team, led by Senior Robotics Scientist Shiraj Sen, has successfully completed Year 1 of a project with the US Army through its Scalable Adaptive Resilient Autonomy (SARA) program to develop and demonstrate a risk-aware autonomous ground vehicle capable of navigating safely in complex off-road test conditions.
[ GE Research ]
Here's one way to add some safety to your industrial robot, I guess?

[ Kuka ]
Okay but seriously how is a kitchen “fully robotic” if you have to do all the prep and cleaning?

Also you left all the good stuff in the pot.
[ Moley ]
Here are a couple of videos showing some recent research from the Brussels Human Robotics Research Center (BruBotics); check the YouTube descriptions for paper references.

[ BruBotics ]

Thanks, Bram!
A Michigan Robotics Colloquium on assistive technologies, hosted by the Robotics Graduate Student Council (RGSC), was held on July 27, 2021.
[ Michigan Robotics ] Continue reading

Posted in Human Robots

#439574 A theoretical approach for designing a ...

Swarm robotics is a relatively new and highly promising research field, which entails the development of multi-robot teams that can move and complete tasks together. Robot swarms could have numerous valuable applications. For instance, they could support humans during search and rescue missions or allow them to monitor geographical areas that are difficult to access. Continue reading

Posted in Human Robots

#439568 Corvus Robotics’ Autonomous Drones ...

Warehouses offer all kinds of opportunities for robots. Semi-structured controlled environments, lots of repetitive tasks, and humans that would almost universally rather be somewhere else. Robots have been doing great at taking over jobs that involve moving stuff from one place to another, but there are all kinds of other things that have to happen to keep warehouses operating efficiently.

Corvus Robotics, a YC-backed startup that's just coming out of stealth, has decided that they want to go after warehouse inventory tracking. That is, making sure that a warehouse knows exactly what's inside of it and where. This is a more complicated task than it seems like it should be, and not just any robot is able to do it. Corvus' solution involves autonomous drones that can fly unattended for weeks on end, collecting inventory data without any human intervention at all.

Many warehouses have a dedicated team of humans whose job is to wander around the warehouse scanning stuff to maintain an up-to-date list of where everything is, a task which is both very important and very boring. As it turns out, autonomous drones can scan up to ten times faster than humans: Corvus Robotics' drones are able to inventory an entire warehouse on a rolling basis in just a couple of days, while it would take a human team weeks to do the same task.

Inventory is a significant opportunity for robotics, and we've seen a bunch of different attempts at doing inventory in places like supermarkets, but warehouses are different. Warehouses can be huge, in every dimension, meaning that the kinds of robots that can make supermarket inventory work just won't cut it in a warehouse environment for the simple reason that they can't see inventory stacked on shelves all the way to the ceiling, which can be over 20m high. And this is why the drone form factor, while novel, actually offers a uniquely useful solution.
It's probably fair to think of a warehouse as a semi-structured environment, with emphasis on the “semi.” At the beginning of a deployment, Corvus will generate one map of the operating area that includes both geometric and semantic information. After that, the drones will autonomously update that map with each flight throughout their entire lifetimes. There are walls and ceilings that don't move, along with large shelving units that are mostly stationary, but those things aren't going to do your localization system any favors since they all look the same. And the stuff that does offer some uniqueness, like the items on those shelves, is changing all the time. “That's a huge problem for us,” says Mohammed Kabir, Corvus Robotics' CTO. “Being able to do place recognition at the granularity that we need while everything is changing is really hard.” If you were looking closely at the video, you may have spotted some fiducials (optical patterns placed in the environment that vision systems find easy to spot), but we're told that the video was shot in Corvus Robotics' development warehouse where those markers are used for ground truth testing.
In real deployments, fiducials (or anything else) aren't necessary. The drone has its charging dock and the initial map, but otherwise it's doing onboard visual-inertial SLAM (simultaneous localization and mapping), dense volumetric mapping, and motion planning with its 10-camera array and an autonomy stack running on ROS and PX4 for real-time flight control. Corvus isn't willing to let us in on all of their secrets, but they did tell us that they incorporate some of the structured components of the environment into their SLAM solution, as well as some things that are semi-static (that is, things unlikely to change over the duration of a single flight), which helps the drone with loop closure.
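To make the semi-static idea concrete, here's a toy sketch (invented names and thresholds, not Corvus's actual code) of how a place-recognition front end might restrict loop-closure matching to landmarks that are likely to persist, so that churning inventory doesn't produce false matches:

```python
# Hypothetical sketch: prefer semi-static landmarks when proposing
# loop-closure matches in a constantly changing warehouse.
from dataclasses import dataclass

@dataclass
class Landmark:
    descriptor: tuple   # stand-in for a real appearance feature vector
    persistence: float  # 0.0 = moves constantly, 1.0 = structural (wall, beam)

def match_score(query: Landmark, candidate: Landmark) -> float:
    # Toy descriptor similarity: 1.0 for identical descriptors, else 0.0.
    return 1.0 if query.descriptor == candidate.descriptor else 0.0

def loop_closure_candidates(query, map_landmarks, min_persistence=0.7):
    """Return map landmarks that look like `query`, keeping only those
    classified as semi-static (unlikely to change within one flight)."""
    return [lm for lm in map_landmarks
            if lm.persistence >= min_persistence
            and match_score(query, lm) > 0.5]

shelf_beam = Landmark(("beam", 42), persistence=0.95)  # structural
pallet_box = Landmark(("beam", 42), persistence=0.10)  # looks similar, but it's inventory
query = Landmark(("beam", 42), persistence=1.0)

# Only the structural beam survives the persistence filter.
print(loop_closure_candidates(query, [shelf_beam, pallet_box]))
```

A real system would of course learn or estimate persistence from repeated flights rather than hand-label it, but the filtering step is the same in spirit.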
One of the big parts of being able to do this is the ability to localize in very large, unstructured environments where things are constantly changing, without having to rely on external infrastructure. For example, a WiFi connection back to our base station is not guaranteed, so everything needs to run on board the drone, which is a non-trivial task. It's essentially all of the compute of a self-driving car, compressed into the drone. -Mohammed Kabir

Corvus is able to scan between 200 and 400 pallet positions per hour per drone, inclusive of recharge time. At ground level, this is probably about equivalent in speed to a human (although more sustainable). But as you start looking at inventory higher off the ground, the drone maintains a constant scan rate, while for a human it gets much harder, involving things like strapping yourself to a forklift. And of course the majority of the items in a high warehouse are not at ground level, because ground level covers only a tier or two of a space that may soar to 20 meters. Overall, Corvus says that they can do inventory up to 10x faster than a human.
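The arithmetic behind those claims is easy to sanity-check. The drone rate below is from Corvus; the warehouse size and the human scan rate are illustrative assumptions, not figures from the company:

```python
# Back-of-the-envelope check on drone vs. human inventory throughput.
pallet_positions = 50_000   # assumed warehouse size
drone_rate = 300            # positions/hour, middle of the quoted 200-400
human_rate = 30             # assumed, averaged over all rack heights

speedup = drone_rate / human_rate                       # per-hour comparison
drone_days = pallet_positions / (drone_rate * 24)       # drones can fly around the clock
human_weeks = pallet_positions / (human_rate * 8 * 5)   # one 8-hour shift, 5 days/week

print(f"{speedup:.0f}x faster per hour")       # 10x
print(f"single drone: {drone_days:.1f} days")  # 6.9 days
print(f"single human: {human_weeks:.1f} weeks")
```

A few drones sharing the job gets the fleet down to the "couple of days" the company quotes, and the gap widens further because drones don't need the warehouse lit or staffed.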
With a few exceptions, it's unlikely that most warehouses are going to be able to go human-free in the foreseeable future, meaning that any time you talk about robot autonomy, you also have to talk about safety. “We can operate when no one's around, so our customers often schedule the drones during the third shift when the warehouse is dark,” says Mohammed Kabir. “There are also customers who want us to operate around people, which initially terrified us, because interacting with humans can be quite tricky. But over the last couple years, we've built safety systems to be able to deal with that.” In addition to the collision avoidance that comes with the 360 degree vision system that the drone uses to navigate, it has a variety of safety-first behaviors all the way up to searching for clear flat spots to land in the event of an emergency. But it sounds like the primary way that Corvus tries to maintain safety is by keeping drones and humans as separate as possible, which may involve process changes for the warehouse, explains Corvus Robotics CEO Jackie Wu. “If you see a drone in an aisle, just don't go in until it's done.”
We also asked Wu about what exactly he means when he calls the Corvus Robotics' drone “fully autonomous,” because depending on who you ask (and what kind of robot and task you're talking about), full autonomy can mean a lot of different things.
For us, full autonomy means continuous end-to-end operation with no human in the loop within a certain scenario or environment. Obviously, it's not level five autonomy, because nobody is doing level five, which would take some kind of generalized intelligence that can fly anywhere. But for level four, for the warehouse interior, the drones fly on scheduled missions, intelligently find objects of interest while avoiding collisions, come back to land, recharge and share that data, all without anybody touching them. And we're able to do this repeatedly, without external localization infrastructure. -Jackie Wu

As tempting as it is, we're not going to get into the weeds here about what exactly constitutes "full autonomy" in the context of drones. Well, okay, maybe we'll get into the weeds a little bit, just to say that being able to repeatedly do a useful task end-to-end without a human in the loop seems close enough to whatever your definition of full autonomy is that it's probably a fair term to apply here. Are there other drones that are arguably more autonomous, in the sense that they require even less structure in the environment? Sure. Are those same drones arguably less autonomous because they don't autonomously recharge? Probably. Corvus Robotics' perspective, that the ability to run a drone autonomously for weeks at a time is a more important component of autonomy, is perfectly valid considering their use case, but I think we're at the point where "full autonomy" at this level is becoming domain-specific enough to make direct comparisons difficult and maybe not all that useful.
Corvus has just recently come out of stealth, and they're currently working on pilot projects with a handful of Global 2000 companies. Continue reading

Posted in Human Robots

#439564 Video Friday: NASA Sending Robots to ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers.

It’s ICRA this week, but since the full proceedings are not yet available, we’re going to wait until we can access everything to cover the conference properly. Or, as properly as we can not being in Xi’an right now.

We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboCup 2021 – June 22-28, 2021 – [Online Event]
RSS 2021 – July 12-16, 2021 – [Online Event]
Humanoids 2020 – July 19-21, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

NASA has selected the DAVINCI+ (Deep Atmosphere Venus Investigation of Noble gases, Chemistry, and Imaging Plus) mission as part of its Discovery program, and it will be the first spacecraft to enter the Venus atmosphere since NASA's Pioneer Venus in 1978 and the USSR's Vega in 1985.

The mission, Deep Atmosphere Venus Investigation of Noble gases, Chemistry, and Imaging Plus, will consist of a spacecraft and a probe. The spacecraft will track motions of the clouds and map surface composition by measuring heat emission from Venus’ surface that escapes to space through the massive atmosphere. The probe will descend through the atmosphere, sampling its chemistry as well as the temperature, pressure, and winds. The probe will also take the first high-resolution images of Alpha Regio, an ancient highland twice the size of Texas with rugged mountains, looking for evidence that past crustal water influenced surface materials.

Launch is targeted for FY2030.

[ NASA ]

Skydio has officially launched their 3D Scan software, turning our favorite fully autonomous drone into a reality capture system.

Skydio held a launch event at the U.S. Space & Rocket Center and the keynote is online; it's actually a fairly interesting 20 minutes with some cool rockets thrown in for good measure.

[ Skydio ]

Space robotics is a key technology for space exploration and an enabling factor for future missions, both scientific and commercial. Underwater tests are a valuable tool for validating robotic technologies for space. In DFKI’s test basin, even large robots can be tested in simulated micro-gravity with mostly unrestricted range of motion.

[ DFKI ]

The Harvard Microrobotics Lab has developed a soft robotic hand with dexterous soft fingers capable of some impressive in-hand manipulation, starting (obviously) with a head of broccoli.

Training soft robots in simulation has been a bit of a challenge, but the researchers developed their own simulation framework that matches the real world pretty closely:

The simulation framework is available to download and use, and you can do some nutty things with it, like simulating tentacle basketball:

I’d pay to watch that IRL.

[ Paper ] via [ Harvard ]

Using the navigation cameras on its mast, NASA's Curiosity Mars rover captured this movie of clouds just after sunset on March 28, 2021, the 3,072nd sol, or Martian day, of the mission. These noctilucent, or twilight, clouds are made of water ice; the ice crystals reflect the setting sun, allowing the detail in each cloud to be seen more easily.

[ JPL ]

Genesis Robotics is working on something, and that's all we know.

[ Genesis Robotics ]

To further improve the autonomous capabilities of future space robots and to advance European efforts in this field, the European Union funded the ADE project, which was completed recently in Wulsbüttel near Bremen. There, the rover “SherpaTT” of the German Research Center for Artificial Intelligence (DFKI) managed to autonomously cover a distance of 500 meters in less than three hours thanks to the successful collaboration of 14 European partners.

[ DFKI ]

For $6.50, a NEXTAGE robot will make an optimized coffee for you. In Japan, of course.

[ Impress ]

Things I’m glad a robot is doing so that I don’t have to: dross skimming.

[ Fanuc ]

Today, anyone can hail a ride to experience the Waymo Driver with our fully autonomous ride-hailing service, Waymo One. Riders Ben and Ida share their experience on one of their recent multi-stop rides. Watch as they take us along for a ride.

[ Waymo ]

The IEEE Robotics and Automation Society Town Hall 2021 featured discussion around Diversity & Inclusion, RAS CARES committee & Code of Conduct, Gender Diversity, and the Developing Country Faculty Engagement Program.

[ IEEE RAS ] Continue reading

Posted in Human Robots

#439559 MIT is Building a Dynamic, Acrobatic ...

For a long time, having a bipedal robot that could walk on a flat surface without falling over (and that could also maybe occasionally climb stairs or something) was a really big deal. But we’re more or less past that now. Thanks to the talented folks at companies like Agility Robotics and Boston Dynamics, we now expect bipedal robots to meet or exceed actual human performance for at least a small subset of dynamic tasks. The next step seems to be to find ways of pushing the limits of human performance, which it turns out means acrobatics. We know that IHMC has been developing their own child-size acrobatic humanoid named Nadia, and now it sounds like researchers from Sangbae Kim’s lab at MIT are working on a new acrobatic robot of their own.

We’ve seen a variety of legged robots from MIT’s Biomimetic Robotics Lab, including Cheetah and HERMES. Recently, they’ve been doing a bunch of work with their spunky little Mini Cheetahs (developed with funding and support from Naver Labs), which are designed for some dynamic stuff like gait exploration and some low-key four-legged acrobatics.

In a paper recently posted to arXiv (to be presented at Humanoids 2020 in July), Matthew Chignoli, Donghyun Kim, Elijah Stanger-Jones, and Sangbae Kim describe “a new humanoid robot design, an actuator-aware kino-dynamic motion planner, and a landing controller as part of a practical system design for highly dynamic motion control of the humanoid robot.” So it’s not just the robot itself, but all of the software infrastructure necessary to get it to do what they want it to do.

MIT Humanoid performing a back flip off of a 0.4 m platform in simulation.
Image: MIT

First let’s talk about the hardware that we’ll be looking at once the MIT Humanoid makes it out of simulation. It’s got the appearance of a sort of upright version of Mini Cheetah, but that appearance is deceiving, says MIT’s Matt Chignoli. While the robot’s torso and arms are very similar to Mini Cheetah, the leg design is totally new and features redesigned actuators with higher power and better torque density. “The main focus of the leg design is to enable smooth but dynamic ‘heel-to-toe’ actions that happen in humans’ walking and running, while maintaining low inertia for smooth interactions with ground contacts,” Chignoli told us in an email. “Dynamic ankle actions have been rare in humanoid robots. We hope to develop robust, low inertia and powerful legs that can mimic human leg actions.”

The design strategy matters because the field of humanoid robots is presently dominated by hydraulically actuated robots and robots with series elastic actuators. As we continue to improve the performance of our proprioceptive actuator technology, as we have done for this work, we aim to demonstrate that our unique combination of high torque density, high bandwidth force control, and the ability to mitigate impacts is optimal for highly dynamic locomotion of any legged robot, including humanoids.

-Matt Chignoli

Now, it’s easy to say “oh well pfft that’s just in simulation and you can get anything to work in simulation,” which, yeah, that’s kinda true. But MIT is putting a lot of work into accurately simulating everything that they possibly can—in particular, they’re modeling the detailed physical constraints that the robot operates under as it performs dynamic motions, allowing the planner to take those constraints into account and (hopefully) resulting in motions that match the simulation pretty accurately.

“When it comes to the physical capabilities of the robot, anything we demonstrate in simulation should be feasible on the robot,” Chignoli says. “We include in our simulations detailed models for the robot’s actuators and battery, models that have been validated experimentally. Such detailed models are not frequently included in dynamic simulations for robots.” But simulation is still simulation, of course, and no matter how good your modeling is, that transfer can be tricky, especially when doing highly dynamic motions.
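To see why modeling the actuators matters, consider a toy version of the problem: an electric motor can't deliver torque at high speed the way it can near stall, because back-EMF eats into the available bus voltage. A planner that assumes a constant torque cap will command motions the hardware can't track. This is a generic DC-motor sketch with invented parameters, not MIT's actual model:

```python
# Toy torque-speed envelope for an electric actuator. Invented parameters;
# illustrates why an "actuator-aware" planner checks feasibility against a
# motor model rather than a single constant torque limit.
def available_torque(speed_rad_s, v_bus=48.0, kt=0.1, resistance=0.2, t_peak=20.0):
    """Max torque (Nm) at a given joint speed for a simple DC motor model.
    Back-EMF (kt * speed) reduces the voltage left to drive current."""
    v_remaining = v_bus - kt * speed_rad_s          # volts left after back-EMF
    t_electrical = kt * (v_remaining / resistance)  # torque from remaining current
    return max(0.0, min(t_peak, t_electrical))

def feasible(torque_cmd, speed_rad_s):
    """Would the motor actually be able to track this commanded torque?"""
    return abs(torque_cmd) <= available_torque(abs(speed_rad_s))

# At low speed the peak-torque cap binds; at high speed back-EMF does.
print(f"{available_torque(0.0):.1f}")    # 20.0 Nm
print(f"{available_torque(200.0):.1f}")  # 14.0 Nm
print(f"{available_torque(480.0):.1f}")  # 0.0 Nm (no voltage headroom left)
```

An actuator-aware planner runs a check like `feasible()` at every point along a candidate trajectory (along with battery and thermal constraints), which is what makes a back flip that works in simulation plausible on hardware.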

“Despite our confidence in our simulator’s ability to accurately mimic the physical capabilities of our robot with high fidelity, there are aspects of our simulator that remain uncertain as we aim to deploy our acrobatic motions onto hardware,” Chignoli explains. “The main difficulty we see is state estimation. We have been drawing upon research related to state estimation for drones, which makes use of visual odometry. Without having an assembled robot to test these new estimation strategies on, though, it is difficult to judge the simulation to real transfer for these types of things.”

We’re told that the design of the MIT Humanoid is complete, and that the plan is to build it for real over the summer, with the eventual goal of doing parkour over challenging terrains. It’s tempting to fixate on the whole acrobatics and parkour angle of things (and we’re totally looking forward to some awesome videos), but according to Chignoli, the really important contribution here is the framework rather than the robot itself:

The acrobatic motions that we demonstrate on our small-scale humanoid are less about the actual acrobatics and more about what the ability to perform such feats implies for both our hardware as well as our control framework. The motions are important in terms of the robot’s capabilities because we are proving, at least in simulation, that we can replicate the dynamic feats of Boston Dynamics’ ATLAS robot using an entirely different actuation scheme (proprioceptive electromagnetic motors vs. hydraulic actuators, respectively). Verification that proprioceptive actuators can achieve the necessary torque density to perform such motions while retaining the advantages of low mechanical impedance and high-bandwidth torque control is important as people consider how to design the next generation of dynamic humanoid robots. Furthermore, the acrobatic motions demonstrate the ability of our “actuator-aware” motion planner to generate feasible motion plans that push the boundaries of what our robot can do.

The MIT Humanoid Robot: Design, Motion Planning, and Control For Acrobatic Behaviors, by Matthew Chignoli, Donghyun Kim, Elijah Stanger-Jones, and Sangbae Kim from MIT and UMass Amherst, will be presented at Humanoids 2020 this July. You can read a preprint on arXiv here. Continue reading

Posted in Human Robots