Tag Archives: humanoid robot

#439286 MIT is Building a Dynamic, Acrobatic ...

For a long time, having a bipedal robot that could walk on a flat surface without falling over (and that could also maybe occasionally climb stairs or something) was a really big deal. But we’re more or less past that now. Thanks to the talented folks at companies like Agility Robotics and Boston Dynamics, we now expect bipedal robots to meet or exceed actual human performance for at least a small subset of dynamic tasks. The next step seems to be to find ways of pushing the limits of human performance, which it turns out means acrobatics. We know that IHMC has been developing their own child-size acrobatic humanoid named Nadia, and now it sounds like researchers from Sangbae Kim’s lab at MIT are working on a new acrobatic robot of their own.

We’ve seen a variety of legged robots from MIT’s Biomimetic Robotics Lab, including Cheetah and HERMES. Recently, they’ve been doing a bunch of work with their spunky little Mini Cheetahs (developed with funding and support from Naver Labs), which are designed for some dynamic stuff like gait exploration and some low-key four-legged acrobatics.

In a paper recently posted to arXiv (to be presented at Humanoids 2020 in July), Matthew Chignoli, Donghyun Kim, Elijah Stanger-Jones, and Sangbae Kim describe “a new humanoid robot design, an actuator-aware kino-dynamic motion planner, and a landing controller as part of a practical system design for highly dynamic motion control of the humanoid robot.” So it’s not just the robot itself, but all of the software infrastructure necessary to get it to do what they want it to do.

Image: MIT

The MIT Humanoid performing a back flip off of a 0.4 m platform in simulation.

First let’s talk about the hardware that we’ll be looking at once the MIT Humanoid makes it out of simulation. It’s got the appearance of a sort of upright version of Mini Cheetah, but that appearance is deceiving, says MIT’s Matt Chignoli. While the robot’s torso and arms are very similar to Mini Cheetah, the leg design is totally new and features redesigned actuators with higher power and better torque density. “The main focus of the leg design is to enable smooth but dynamic ‘heel-to-toe’ actions that happen in humans’ walking and running, while maintaining low inertia for smooth interactions with ground contacts,” Chignoli told us in an email. “Dynamic ankle actions have been rare in humanoid robots. We hope to develop robust, low inertia and powerful legs that can mimic human leg actions.”

The design strategy matters because the field of humanoid robots is presently dominated by hydraulically actuated robots and robots with series elastic actuators. As we continue to improve the performance of our proprioceptive actuator technology, as we have done for this work, we aim to demonstrate that our unique combination of high torque density, high bandwidth force control, and the ability to mitigate impacts is optimal for highly dynamic locomotion of any legged robot, including humanoids.

-Matt Chignoli

Now, it’s easy to say “oh well pfft that’s just in simulation and you can get anything to work in simulation,” which, yeah, that’s kinda true. But MIT is putting a lot of work into accurately simulating everything that they possibly can—in particular, they’re modeling the detailed physical constraints that the robot operates under as it performs dynamic motions, allowing the planner to take those constraints into account and (hopefully) resulting in motions that match the simulation pretty accurately.

“When it comes to the physical capabilities of the robot, anything we demonstrate in simulation should be feasible on the robot,” Chignoli says. “We include in our simulations detailed models for the robot’s actuators and battery, models that have been validated experimentally. Such detailed models are not frequently included in dynamic simulations for robots.” But simulation is still simulation, of course, and no matter how good your modeling is, that transfer can be tricky, especially when doing highly dynamic motions.
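To give a sense of what "actuator-aware" planning can involve, here is a minimal Python sketch of the kind of torque-speed and battery-voltage feasibility check a planner could run on candidate joint trajectories. The model, parameter values, and names (MotorParams, trajectory_is_feasible) are illustrative assumptions, not MIT's actual code or specifications.

```python
from dataclasses import dataclass

# Hypothetical actuator model: an "actuator-aware" planner can reject candidate
# motions whose required torques exceed what the motor can deliver at a given
# joint speed and battery voltage. Parameter names and values are illustrative
# only, not the MIT Humanoid's actual specifications.

@dataclass
class MotorParams:
    tau_max: float      # stall torque at the joint, N*m
    omega_max: float    # no-load joint speed, rad/s
    kt: float           # torque constant referred to the joint, N*m/A
    resistance: float   # winding resistance, ohms

def available_torque(p: MotorParams, omega: float, v_bus: float) -> float:
    """Torque available at joint speed omega given bus voltage v_bus,
    using a crude linear (back-EMF-limited) torque-speed curve."""
    back_emf = v_bus * abs(omega) / p.omega_max
    tau_limit = p.kt * max(v_bus - back_emf, 0.0) / p.resistance
    return min(tau_limit, p.tau_max)

def trajectory_is_feasible(p: MotorParams, taus, omegas, v_bus: float) -> bool:
    """Check every (torque, speed) sample of a candidate joint trajectory
    against the actuator's torque-speed envelope."""
    return all(abs(t) <= available_torque(p, w, v_bus)
               for t, w in zip(taus, omegas))

# Example: a jump that demands 40 N*m at 20 rad/s from a 34 N*m motor fails.
motor = MotorParams(tau_max=34.0, omega_max=40.0, kt=0.5, resistance=0.2)
print(trajectory_is_feasible(motor, taus=[10.0, 40.0], omegas=[5.0, 20.0], v_bus=48.0))
```

The point is simply that the planner throws out motions the real actuators could never deliver, which is what gives simulated acrobatics a fighting chance on hardware.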

“Despite our confidence in our simulator’s ability to accurately mimic the physical capabilities of our robot with high fidelity, there are aspects of our simulator that remain uncertain as we aim to deploy our acrobatic motions onto hardware,” Chignoli explains. “The main difficulty we see is state estimation. We have been drawing upon research related to state estimation for drones, which makes use of visual odometry. Without having an assembled robot to test these new estimation strategies on, though, it is difficult to judge the simulation to real transfer for these types of things.”

We’re told that the design of the MIT Humanoid is complete, and that the plan is to build it for real over the summer, with the eventual goal of doing parkour over challenging terrains. It’s tempting to fixate on the whole acrobatics and parkour angle of things (and we’re totally looking forward to some awesome videos), but according to Chignoli, the really important contribution here is the framework rather than the robot itself:

The acrobatic motions that we demonstrate on our small-scale humanoid are less about the actual acrobatics and more about what the ability to perform such feats implies for both our hardware as well as our control framework. The motions are important in terms of the robot’s capabilities because we are proving, at least in simulation, that we can replicate the dynamic feats of Boston Dynamics’ ATLAS robot using an entirely different actuation scheme (proprioceptive electromagnetic motors vs. hydraulic actuators, respectively). Verification that proprioceptive actuators can achieve the necessary torque density to perform such motions while retaining the advantages of low mechanical impedance and high-bandwidth torque control is important as people consider how to design the next generation of dynamic humanoid robots. Furthermore, the acrobatic motions demonstrate the ability of our “actuator-aware” motion planner to generate feasible motion plans that push the boundaries of what our robot can do.

The MIT Humanoid Robot: Design, Motion Planning, and Control For Acrobatic Behaviors, by Matthew Chignoli, Donghyun Kim, Elijah Stanger-Jones, and Sangbae Kim from MIT and UMass Amherst, will be presented at Humanoids 2020 this July. You can read a preprint on arXiv here.

Posted in Human Robots

#439241 The MIT humanoid robot: A dynamic ...

Creating robots that can perform acrobatic movements such as flips or spinning jumps can be highly challenging. Typically, in fact, these robots require sophisticated hardware designs, motion planners and control algorithms.

Posted in Human Robots

#438882 Robotics in the entertainment industry

Mesmer Entertainment Robotics demonstrate some of their humanoid animatronics, as well as their humanoid robot, Owen.

Posted in Human Robots

#439200 How Disney Imagineering Crammed a ...

From what I’ve seen of humanoid robotics, there’s a fairly substantial divide between what folks in the research space traditionally call robotics, and something like animatronics, which tends to be much more character-driven.

There’s plenty of technology embodied in animatronic robotics, but usually under some fairly significant constraints—like, they’re not autonomously interactive, or they’re stapled to the floor and tethered for power, things like that. And there are reasons for doing it this way: namely, dynamic untethered humanoid robots are already super hard, so why would anyone stress themselves out even more by trying to make them into an interactive character at the same time? That would be crazy!

At Walt Disney Imagineering, which is apparently full of crazy people, they’ve spent the last three years working on Project Kiwi: a dynamic untethered humanoid robot that’s an interactive character at the same time. We asked them (among other things) just how they managed to stuff all of the stuff they needed to stuff into that costume, and how they expect to enable children (of all ages) to interact with the robot safely.

Project Kiwi is an untethered bipedal humanoid robot that Disney Imagineering designed not just to walk without falling over, but to walk without falling over with some character. At about 0.75 meters tall, Kiwi is a bit bigger than a NAO and a bit smaller than an iCub, and it’s just about completely self-contained, with the tether you see in the video being used for control rather than for power. Kiwi can manage 45 minutes of operating time, which is pretty impressive considering its size and the fact that it incorporates a staggering 50 degrees of freedom, a requirement for lifelike motion.

This version of the robot is just a prototype, and it sounds like there’s plenty to do in terms of hardware optimization to improve efficiency and add sensing and interactivity. The most surprising thing to me is that this is not a stage robot: Disney does plan to have some future version of Kiwi wandering around and interacting directly with park guests, and I’m sure you can imagine how that’s likely to go. Interaction at this level, where there’s a substantial risk of small children tackling your robot with a vicious high-speed hug, could be a uniquely Disney problem for a robot with this level of sophistication. And it’s one of the reasons they needed to build their own robot—when Universal Studios decided to try out a Steampunk Spot, for example, they had to put a fence plus a row of potted plants between it and any potential hugs, because Spot is very much not a hug-safe robot.

So how the heck do you design a humanoid robot from scratch with personality and safe human interaction in mind? We asked Scott LaValley, Project Kiwi lead, who came to Disney Imagineering by way of Boston Dynamics and some of our favorite robots ever (including RHex, PETMAN, and Atlas), to explain how they pulled it off.

IEEE Spectrum: What are some of the constraints of Disney’s use case that meant you had to develop your own platform from the ground up?

Scott LaValley: First and foremost, we had to consider the packaging constraints. Our robot was always intended to serve as a bipedal character platform capable of taking on the role of a variety of our small-size characters. While we can sometimes take artistic liberties, for the most part, the electromechanical design had to fit within a minimal character profile to allow the robot to be fully themed with shells, skin, and costuming. When we were determining the scope of the project, a high-performance biped that matched our size constraints just did not exist.

Equally important was the ability to move with style and personality, or the “emotion of motion.” To really capture a specific character performance, a robotic platform must be capable of motions that range from fast and expressive to extremely slow and nuanced. In our case, this required developing custom high-speed actuators with the necessary torque density to be packaged into the mechanical structure. Each actuator is also equipped with a mechanical clutch and inline torque sensor to support low-stiffness control for compliant interactions and reduced vibration.
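As a generic aside (and not Disney's implementation), low-stiffness joint control of the kind LaValley describes can be sketched in a few lines: soft position gains plus a feedforward term, with the commanded torque clamped so that contact stays gentle. The gains, limits, and function name below are assumptions for illustration.

```python
# Minimal sketch of low-stiffness joint impedance control: track a desired
# joint angle with deliberately soft PD gains plus a feedforward term, and
# clamp the commanded torque so unexpected contact stays gentle. Gains and
# limits are illustrative, not Project Kiwi's actual values.

def compliant_joint_torque(q_des, qd_des, q, qd, tau_ff,
                           kp=8.0, kd=0.5, tau_limit=5.0):
    """Return a torque command for one joint.

    q_des, qd_des : desired position (rad) and velocity (rad/s)
    q, qd         : measured position and velocity
    tau_ff        : feedforward torque, e.g. gravity compensation (N*m)
    kp, kd        : intentionally low stiffness and damping gains
    tau_limit     : hard clamp so contact forces stay small
    """
    tau = kp * (q_des - q) + kd * (qd_des - qd) + tau_ff
    return max(-tau_limit, min(tau_limit, tau))

# Example: the joint is pushed 0.3 rad away from its target by a handshake;
# the soft gains ask for only about 2.4 N*m instead of fighting the person.
print(compliant_joint_torque(q_des=0.0, qd_des=0.0, q=0.3, qd=0.0, tau_ff=0.0))
```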

Designing custom hardware also allowed us to include additional joints that are uncommon in humanoid robots. For example, the clavicle and shoulder alone include five degrees of freedom to support a shrug function and an extended configuration space for more natural gestures. We were also able to integrate onboard computing to support interactive behaviors.

What compromises were required to make sure that your robot was not only functional, but also capable of becoming an expressive character?

As mentioned previously, we face serious challenges in terms of packaging and component selection due to the small size and character profile. This has led to a few compromises on the design side. For example, we currently rely on rigid-flex circuit boards to fit our electronics onto the available surface area of our parts without additional cables or connectors. Unfortunately, these boards are harder to design and manufacture than standard rigid boards, increasing complexity, cost, and build time. We might also consider increasing the size of the hip and knee actuators if they no longer needed to fit within a themed costume.

Designing a reliable walking robot is in itself a significant challenge, but adding style and personality to each motion is a new layer of complexity. From a software perspective, we spend a significant amount of time developing motion planning and animation tools that allow animators to author stylized gaits, gestures, and expressions for physical characters. Unfortunately, unlike on-screen characters, we do not have the option to bend the laws of physics and must validate each motion through simulation. As a result, we are currently limited to stylized walking and dancing on mostly flat ground, but we hope to be skipping up stairs in the future!

Of course, there is always more that can be done to better match the performance you would expect from a character. We are excited about some things we have in the pipeline, including a next generation lower body and an improved locomotion planner.

How are you going to make this robot safe for guests to be around?

First let us say, we take safety extremely seriously, and it is a top priority for any Disney experience. Ultimately, we do intend to allow interactions with guests of all ages, but it will take a measured process to get there. Proper safety evaluation is a big part of productizing any Research & Development project, and we plan to conduct playtests with our Imagineers, cast members and guests along the way. Their feedback will help determine exactly what an experience with a robotic character will look like once implemented.

From a design standpoint, we believe that small characters are the safest type of biped for human-robot interaction due to their reduced weight and low center of mass. We are also employing compliant control strategies to ensure that the robot’s actuators are torque-limited and backdrivable. Perception and behavior design may also play a key role, but in the end, we will rely on proper show design to permit a safe level of interaction as the technology evolves.
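Again as a generic illustration rather than Disney's safety system, an inline torque sensor on a backdrivable joint makes simple contact detection possible: compare the measured torque with what the controller expects, and yield when the residual is large. The thresholds and names below are assumptions.

```python
# Generic contact-detection sketch for a torque-sensing, backdrivable joint:
# compare measured torque against the model-predicted torque and yield when
# the residual suggests an external push (e.g. a hug). Thresholds and names
# are assumptions for illustration, not Disney's safety system.

def external_torque(tau_measured: float, tau_expected: float) -> float:
    """Residual between sensed joint torque and the controller's prediction."""
    return tau_measured - tau_expected

def should_yield(tau_measured: float, tau_expected: float,
                 threshold: float = 3.0) -> bool:
    """Yield (drop stiffness / pause the gait) if the residual exceeds a limit."""
    return abs(external_torque(tau_measured, tau_expected)) > threshold

# Example: the model expects 1.0 N*m but the sensor reads 5.5 N*m, so
# something (or someone) is pushing on the limb and the joint should yield.
print(should_yield(tau_measured=5.5, tau_expected=1.0))  # True
```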

What do you think other roboticists working on legged systems could learn from Project Kiwi?

We are often inspired by other roboticists working on legged systems ourselves but would be happy to share some lessons learned. Remember that robotics is fundamentally interdisciplinary, and a good team typically consists of a mix of hardware and software engineers in close collaboration. In our experience, however, artists and animators play an equally valuable role in bringing a new vision to life. We often pull in ideas from the character animation and game development world, and while robotic characters are far more constrained than their virtual counterparts, we are solving many of the same problems. Another tip is to leverage motion studies (through animation, motion capture, and/or simulation tools) early in the design process to generate performance-driven requirements for any new robot.

Now that Project Kiwi has de-stealthed, I hope the Disney Imagineering folks will be able to be a little more open with all of the sweet goo inside of the fuzzy skin of this metaphor that has stopped making sense. Meeting a new humanoid robot is always exciting, and the approach here (with its technical capability combined with an emphasis on character and interaction) is totally unique. And if they need anyone to test Kiwi’s huggability, I volunteer! You know, for science.

Posted in Human Robots

#438286 Humanoids that’ll blow your mind!

Here, the PRO Robots Channel highlights five of the most advanced humanoid robots.

Posted in Human Robots