Tag Archives: robot

#439089 Ingenuity’s Chief Pilot Explains How ...

On April 11, the Mars helicopter Ingenuity will take to the skies of Mars for the first time. It will do so fully autonomously, out of necessity—the time delay between Ingenuity’s pilots at the Jet Propulsion Laboratory and Jezero Crater on Mars makes manual or even supervisory control impossible. So the best that the folks at JPL can do is practice as much as they can in simulation, and then hope that the helicopter can handle everything on its own.

Here on Earth, simulation is a critical tool for many robotics applications, because it doesn’t rely on access to expensive hardware, is non-destructive, and can be run in parallel and at faster-than-real-time speeds to focus on solving specific problems. Once you think you’ve gotten everything figured out in simulation, you can always give it a try on the real robot and see how close you came. If it works in real life, great! And if not, well, you can tweak some stuff in the simulation and try again.

For the Mars helicopter, simulation is even more important, and the stakes are much higher. Testing the Mars helicopter under conditions matching what it’ll find on Mars is not physically possible on Earth. JPL has flown engineering models in Martian atmospheric conditions, and they’ve used an actuated tether to mimic Mars gravity, but there’s just no way to know what it’ll be like flying on Mars until they’ve actually flown on Mars. With that in mind, the Ingenuity team has been relying heavily on simulation, since that’s one of the best tools they have to prepare for their Martian flights. We talk with Ingenuity’s Chief Pilot, Håvard Grip, to learn how it all works.

Ingenuity Facts:

Body size: a box of tissues

Brains: Qualcomm Snapdragon 801

Weight: 1.8 kilograms

Propulsion: Two 1.2-meter carbon fiber rotors

Navigation sensors: VGA camera, laser altimeter, inclinometer

Ingenuity is scheduled to make its first flight no earlier than April 11. Before liftoff, the Ingenuity team will conduct a variety of pre-flight checks, including verifying the responsiveness of the control system and spinning the blades up to full speed (2,537 rpm) without lifting off. If everything looks good, the first flight will consist of a 1 meter per second climb to 3 meters, 30 seconds of hover at 3 meters while rotating in place a bit, and then a descent to landing. If Ingenuity pulls this off, its entire mission will be considered a success. There will be more flights over the next few weeks, but all it takes is one to prove that autonomous helicopter flight on Mars is possible.
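
The flight plan above is concrete enough to rough out a timeline. Here is a minimal sketch that turns it into timestamped waypoints; only the climb rate is specified, so the matching 1 m/s descent rate is our assumption.

```python
# Rough timeline for Ingenuity's planned first flight, assuming the
# descent rate matches the 1 m/s climb rate (an assumption; the plan
# above only specifies the climb rate).

CLIMB_RATE_MPS = 1.0      # from the flight plan
HOVER_ALT_M = 3.0         # hover altitude, meters
HOVER_TIME_S = 30.0       # hover duration, seconds
DESCENT_RATE_MPS = 1.0    # assumed equal to the climb rate

def flight_profile():
    """Return (time_s, altitude_m, phase) waypoints for the nominal flight."""
    t_climb = HOVER_ALT_M / CLIMB_RATE_MPS
    t_descent = HOVER_ALT_M / DESCENT_RATE_MPS
    return [
        (0.0, 0.0, "liftoff"),
        (t_climb, HOVER_ALT_M, "start hover / yaw in place"),
        (t_climb + HOVER_TIME_S, HOVER_ALT_M, "start descent"),
        (t_climb + HOVER_TIME_S + t_descent, 0.0, "touchdown"),
    ]

if __name__ == "__main__":
    for t, alt, phase in flight_profile():
        print(f"t = {t:5.1f} s  altitude = {alt:.1f} m  ({phase})")
```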

Last month, we spoke with Mars Helicopter Operations Lead Tim Canham about Ingenuity’s hardware, software, and autonomy, but we wanted to know more about how the Ingenuity team has been using simulation for everything from vehicle design to flight planning. To answer our questions, we talked with JPL’s Håvard Grip, who led the development of Ingenuity’s navigation and flight control systems. Grip also has the title of Ingenuity Chief Pilot, which is pretty awesome. He summarizes this role as “operating the flight control system to make the helicopter do what we want it to do.”

IEEE Spectrum: Can you tell me about the simulation environment that JPL uses for Ingenuity’s flight planning?

Håvard Grip: We developed a Mars helicopter simulation ourselves at JPL, based on a multi-body simulation framework that’s also developed at JPL, called DARTS/DSHELL. That's a system that has been in development at JPL for about 30 years now, and it's been used in a number of missions. And so we took that multibody simulation framework, and based on it we built our own Mars helicopter simulation, put together our own rotor model, our own aerodynamics models, and everything else that's needed in order to simulate a helicopter. We also had a lot of help from the rotorcraft experts at NASA Ames and NASA Langley.
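
DARTS/DSHELL itself is JPL-internal software, but a back-of-the-envelope momentum-theory calculation hints at why a dedicated rotor model matters so much on Mars. The sketch below compares ideal hover power for a 1.8-kilogram coaxial vehicle on Mars and on Earth; the Mars air density, the shared-disk assumption, and the neglect of profile drag and other losses are all simplifications of ours, not numbers from JPL's models.

```python
import math

# Back-of-the-envelope momentum-theory comparison of hovering the same
# 1.8 kg coaxial vehicle on Mars vs. Earth. This is ideal induced power
# only (real power is substantially higher once profile drag and losses
# are included) and it is not the rotor model JPL actually built.

RHO_MARS = 0.017      # kg/m^3, rough Mars surface air density (assumed)
RHO_EARTH = 1.225     # kg/m^3, sea-level standard atmosphere
G_MARS, G_EARTH = 3.71, 9.81   # m/s^2
MASS = 1.8            # kg, from the article
ROTOR_RADIUS = 0.6    # m (1.2 m diameter rotors)
DISK_AREA = math.pi * ROTOR_RADIUS**2   # coaxial rotors treated as one disk

def ideal_hover_power(rho, g):
    thrust = MASS * g                                          # N to support weight
    induced_vel = math.sqrt(thrust / (2.0 * rho * DISK_AREA))  # m/s, momentum theory
    return thrust, induced_vel, thrust * induced_vel           # W, ideal induced power

for name, rho, g in [("Mars", RHO_MARS, G_MARS), ("Earth", RHO_EARTH, G_EARTH)]:
    thrust, v_i, power = ideal_hover_power(rho, g)
    print(f"{name}: thrust {thrust:4.1f} N, induced velocity {v_i:5.2f} m/s, "
          f"ideal induced power {power:5.1f} W")
```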

Image: NASA/JPL

Ingenuity in JPL’s flight simulator.

Without being able to test on Mars, how much validation are you able to do of what you’re seeing in simulation?

We can do a fair amount, but it requires a lot of planning. When we made our first real prototype (with a full-size rotor that looked like what we were thinking of putting on Mars) we first spent a lot of time designing it and using simulation tools to guide that design, and when we were sufficiently confident that we were close enough, and that we understood enough about it, then we actually built the thing and designed a whole suite of tests in a vacuum chamber where we could replicate Mars atmospheric conditions. And those tests were before we tried to fly the helicopter—they were specifically targeted at what we call system identification, which has to do with figuring out what the true properties, the true dynamics of a system are, compared to what we assumed in our models. So then we got to see how well our models did, and in the places where they needed adjustment, we could go back and do that.
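
To make the system-identification step concrete, here is a generic sketch of the idea: predict a response with a parametric model, compare it against test data, and adjust the parameters until model and measurement agree. The first-order thrust model, the synthetic "measurements," and the brute-force search are all stand-ins of ours, not JPL's identification pipeline.

```python
import numpy as np

# Generic system-identification sketch: predict a response with a
# parametric model, compare against test data, and pick the parameters
# that fit best. The "test data" is synthetic; this stands in for (and
# vastly simplifies) the model-vs-measurement comparison described above.

def thrust_step_model(t, gain, tau):
    """Assumed first-order thrust response to a step in collective command."""
    return gain * (1.0 - np.exp(-t / tau))

# Synthetic measurements: a "true" gain and time constant plus sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)
measured = thrust_step_model(t, gain=6.7, tau=0.15) + 0.1 * rng.standard_normal(t.size)

# Transparent brute-force least squares over a parameter grid.
best = (np.inf, None, None)
for gain in np.linspace(4.0, 9.0, 101):
    for tau in np.linspace(0.02, 0.5, 97):
        err = np.sum((thrust_step_model(t, gain, tau) - measured) ** 2)
        if err < best[0]:
            best = (err, gain, tau)

_, gain_hat, tau_hat = best
print(f"identified gain ~{gain_hat:.2f} N, time constant ~{tau_hat:.3f} s")
```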

The simulation work that we really started after that very first initial lift test, that’s what allowed us to unlock all of the secrets to building a helicopter that can fly on Mars.
—Håvard Grip, Ingenuity Chief Pilot

We did a lot of this kind of testing. It was a big campaign, in several stages. But there are of course things that you can't fully replicate, and you do depend on simulation to tie things together. For example, we can't truly replicate Martian gravity on Earth. We can replicate the atmosphere, but not the gravity, and so we have to do various things when we fly—either make the helicopter very light, or we have to help it a little bit by pulling up on it with a string to offload some of the weight. These things don't fully replicate what it will be like on Mars. We also can't simultaneously replicate the Mars aerodynamic environment and the physical and visual surroundings that the helicopter will be flying in. These are places where simulation tools definitely come in handy, with the ability to do full flight tests from A to B, with the helicopter taking off from the ground, running the flight software that it will be running on board, simulating the images that the navigation camera takes of the ground below as it flies, feeding that back into the flight software, and then controlling it.
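
That kind of closed-loop, flight-software-in-the-loop setup has a common overall shape, sketched below: a truth model steps the physics, synthetic sensor data (including rendered navigation-camera frames) goes to the onboard software, and the software's commands close the loop. Every component here is a simplified stand-in of ours, not JPL's simulation or flight code.

```python
from dataclasses import dataclass
import numpy as np

# Skeleton of an image-in-the-loop simulation of the kind described above:
# truth physics -> simulated sensors (including a rendered camera frame) ->
# onboard flight software -> actuator commands -> back into the physics.
# Every component is a simplified stand-in, not JPL's simulation or software.

G_MARS, MASS, DT = 3.71, 1.8, 0.01   # m/s^2, kg, s

@dataclass
class TruthState:
    altitude: float = 0.0   # m
    velocity: float = 0.0   # m/s, vertical

def step_physics(state: TruthState, thrust_n: float) -> TruthState:
    accel = thrust_n / MASS - G_MARS
    return TruthState(state.altitude + state.velocity * DT,
                      state.velocity + accel * DT)

def render_nav_camera(state: TruthState, rng) -> np.ndarray:
    """Stand-in for rendering a synthetic VGA frame of the terrain below
    (a real renderer would use the vehicle state to pick the view)."""
    return rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

def simulated_sensors(state: TruthState, rng) -> dict:
    return {
        "image": render_nav_camera(state, rng),                  # for visual nav
        "altimeter": state.altitude + rng.normal(0.0, 0.02),     # m, noisy
        "velocity_est": state.velocity + rng.normal(0.0, 0.05),  # m/s, noisy
    }

def flight_software(sensors: dict, target_alt: float) -> float:
    """Stand-in controller: PD altitude hold. In the real system the image
    would feed visual navigation; here it is carried along but unused."""
    climb_accel = np.clip(
        1.0 * (target_alt - sensors["altimeter"]) - 1.5 * sensors["velocity_est"],
        -1.0, 1.0)
    return MASS * (G_MARS + climb_accel)     # commanded thrust, newtons

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    state = TruthState()
    for _ in range(int(40.0 / DT)):
        thrust = flight_software(simulated_sensors(state, rng), target_alt=3.0)
        state = step_physics(state, thrust)
    print(f"simulated altitude after 40 s: {state.altitude:.2f} m")
```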

To what extent can simulation really compensate for the kinds of physical testing that you can’t do on Earth?

It gives you a few different possibilities. We can take certain tests on Earth where we replicate key elements of the environment, like the atmosphere or the visual surroundings for example, and you can validate your simulation on those parameters that you can test on Earth. Then, you can combine those things in simulation, which gives you the ability to set up arbitrary scenarios and do lots and lots of tests. We can Monte Carlo things, we can do a flight a thousand times in a row, with small perturbations of various parameters and tease out what our sensitivities are to those things. And those are the kinds of things that you can't do with physical tests, both because you can't fully replicate the environment and also because of the resources that would be required to do the same thing a thousand times in a row.

Because there are limits to the physical testing we can do on Earth, there are elements where we know there's more uncertainty. On those aspects where the uncertainty is high, we tried to build in enough margin that we can handle a range of things. And simulation gives you the ability to then maybe play with those parameters, and put them at their outer limits, and test them beyond where the real parameters are going to be to make sure that you have robustness even in those extreme cases.
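
The Monte Carlo idea is easy to illustrate: fly the same nominal profile many times with parameters drawn from deliberately overbounded distributions, then look at the spread of outcomes. In the toy sketch below, the vehicle model, the controller, and every dispersion range are invented for illustration.

```python
import numpy as np

# Toy Monte Carlo dispersion study: fly the same simple hover profile many
# times with perturbed parameters and look at the spread of final altitude
# error. The vehicle model, controller, and parameter ranges are all
# illustrative placeholders, deliberately overbounded rather than realistic.

rng = np.random.default_rng(42)
N_RUNS = 1000
DT, DURATION, TARGET_ALT = 0.02, 30.0, 3.0

def fly_once(mass, thrust_scale, alt_bias, wind_force):
    """Simple 1-D hover sim with a PD controller; returns final altitude error."""
    alt, vel = 0.0, 0.0
    for _ in range(int(DURATION / DT)):
        meas_alt = alt + alt_bias
        accel_cmd = 1.0 * (TARGET_ALT - meas_alt) - 1.5 * vel
        accel = thrust_scale * np.clip(accel_cmd, -2.0, 2.0) + wind_force / mass
        vel += accel * DT
        alt += vel * DT
    return alt - TARGET_ALT

errors = np.array([
    fly_once(
        mass=rng.normal(1.8, 0.05),            # kg, mass uncertainty
        thrust_scale=rng.normal(1.0, 0.10),    # rotor effectiveness error
        alt_bias=rng.normal(0.0, 0.05),        # m, altimeter bias
        wind_force=rng.normal(0.0, 0.3),       # N, steady disturbance force
    )
    for _ in range(N_RUNS)
])

print(f"{N_RUNS} runs: mean final error {errors.mean():+.3f} m, "
      f"std {errors.std():.3f} m, worst |error| {np.abs(errors).max():.3f} m")
```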

How do you make sure you’re not relying on simulation too much, especially since in some ways it’s your only option?

It’s about anchoring it in real data, and we’ve done a lot of that with our physical testing. I think what you’re referring to is making your simulation too perfect, and we’re careful to model the things that matter. For example, the simulated sensors that we use have realistic levels of simulated noise and bias in them, the navigation camera images have realistic levels of degradation, we have realistic disturbances from wind gusts. If you don’t properly account for those things, then you’re missing important details. So, we try to be as accurate as we can, and to capture that by overbounding in areas where we have a high degree of uncertainty.
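
Those "realistic levels of noise and bias" are typically injected with additive error models like the ones sketched below: a slowly wandering bias plus white noise on scalar measurements, and blur plus pixel noise on rendered camera frames. The magnitudes here are placeholders, not Ingenuity's actual sensor characteristics.

```python
import numpy as np

# Generic sensor-corruption models of the kind a flight simulation applies
# to "perfect" truth data before handing it to the flight software. Noise
# and bias magnitudes are illustrative placeholders, not Ingenuity's specs.

class BiasedNoisySensor:
    """Truth value + slowly random-walking bias + white measurement noise."""
    def __init__(self, noise_std, bias_walk_std, rng):
        self.noise_std, self.bias_walk_std = noise_std, bias_walk_std
        self.bias, self.rng = 0.0, rng

    def measure(self, truth):
        self.bias += self.rng.normal(0.0, self.bias_walk_std)   # bias drifts slowly
        return truth + self.bias + self.rng.normal(0.0, self.noise_std)

def degrade_image(image, rng, blur_kernel=3, noise_std=4.0):
    """Cheap image degradation: separable box blur plus additive pixel noise."""
    blurred = image.astype(float)
    half = blur_kernel // 2
    for axis in (0, 1):
        blurred = np.stack(
            [np.roll(blurred, s, axis=axis) for s in range(-half, half + 1)]
        ).mean(axis=0)
    noisy = blurred + rng.normal(0.0, noise_std, size=image.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    altimeter = BiasedNoisySensor(noise_std=0.02, bias_walk_std=0.001, rng=rng)
    print("altimeter reads at 3 m truth:",
          [round(altimeter.measure(3.0), 3) for _ in range(5)])
    frame = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
    degraded = degrade_image(frame, rng)
    print("degraded frame dtype/shape:", degraded.dtype, degraded.shape)
```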

What kinds of simulated challenges have you put the Mars helicopter through, and how do you decide how far to push those challenges?

One example is that we can simulate going over rougher terrain. We can push that, and see how far we can go and still have the helicopter behave the way that we want it to. Or we can inject levels of noise that maybe the real sensors don't see, but you want to just see how far you can push things and make sure that it's still robust.

Where we put the limits on this and what we consider to be realistic is often a challenge. We consider this on a case by case basis—if you have a sensor that you're dealing with, you try to do testing with it to characterize it and understand its performance as much as possible, and you build a level of confidence in it that allows you to find the proper balance.

When it comes to things like terrain roughness, it's a little bit of a different thing, because we're actually picking where we're flying the helicopter. We have made that choice, and we know what the terrain looks like around us, so we don’t have to wonder about that anymore.

Image: NASA/JPL-Caltech/University of Arizona

Satellite image of the Ingenuity flight area.

The way that we’re trying to approach this operationally is that we should be done with the engineering at this point. We’re not depending on going back and resimulating things, other than a few checks here and there.

Are there any examples of things you learned as part of the simulation process that resulted in changes to the hardware or mission?

You know, it’s been a journey. One of the early things that we discovered as part of modeling the helicopter was that the rotor dynamics were quite different for a helicopter on Mars, in particular with respect to how the rotor responds to the up and down bending of the blades because they’re not perfectly rigid. That motion is a very important influence on the overall flight dynamics of the helicopter, and what we discovered as we started modeling was that this motion is damped much less on Mars. Under-damped oscillatory things like that, you kind of figure might pose a control issue, and that is the case here: if you just naively design it as you might a helicopter on Earth, without taking this into account, you could have a system where the response to control inputs becomes very sluggish. So that required changes to the vehicle design from some of the very early concepts, and it led us to make a rotor that’s extremely light and rigid.
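
One simple way to see why reduced damping matters: the same second-order mode, hit with the same step input, overshoots far more and takes far longer to settle when it is lightly damped, which is part of why a controller designed without accounting for it behaves poorly. The sketch below compares two damping ratios that are purely illustrative, not Ingenuity's actual rotor flap parameters.

```python
import numpy as np

# Illustration of why lower damping changes the control problem: the same
# second-order mode, excited by a unit step, at a well-damped vs. a lightly
# damped ratio. Both numbers are made up for illustration.

def step_response(wn, zeta, t):
    """Unit step response of x'' + 2*zeta*wn*x' + wn^2*x = wn^2*u (zeta < 1)."""
    wd = wn * np.sqrt(1.0 - zeta**2)
    return 1.0 - np.exp(-zeta * wn * t) * (
        np.cos(wd * t) + (zeta * wn / wd) * np.sin(wd * t)
    )

def settling_time(response, t, tol=0.02):
    """Last time the response is outside a +/- tol band around 1.0."""
    outside = np.where(np.abs(response - 1.0) > tol)[0]
    return t[outside[-1]] if outside.size else 0.0

t = np.linspace(0.0, 20.0, 4000)
for label, zeta in [("well damped (illustrative 'Earth')", 0.6),
                    ("lightly damped (illustrative 'Mars')", 0.05)]:
    resp = step_response(wn=5.0, zeta=zeta, t=t)
    print(f"{label}: overshoot {100 * (resp.max() - 1.0):5.1f} %, "
          f"~2% settling time {settling_time(resp, t):5.2f} s")
```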

The design cycle for the Mars helicopter—it’s not like we could just build something and take it out to the back yard and try it and then come back and tweak it if it doesn’t work. It’s a much bigger effort to build something and develop a test program where you have to use a vacuum chamber to test it. So you really want to get as close as possible up front, on your first iteration, and not have to go back to the drawing board on the basic things.

So how close were you able to get on your first iteration of the helicopter design?

[This video shows] a very early demo which was done more or less just assuming that things were going to behave as they would on Earth, and that we’d be able to fly in a Martian atmosphere just spinning the rotor faster and having a very light helicopter. We were basically just trying to demonstrate that we could produce enough lift. You can see the helicopter hopping around, with someone trying to joystick it, but it turned out to be very hard to control. This was prior to doing any of the modeling that I talked about earlier. But once we started seriously focusing on the modeling and simulation, we then went on to build a prototype vehicle which had a full-size rotor that’s very close to the rotor that will be flying on Mars. One difference is that prototype had cyclic control only on the lower rotor, and later we added cyclic control on the upper rotor as well, and that decision was informed in large part by the work we did in simulation—we’d put in the kinds of disturbances that we thought we might see on Mars, and decided that we needed to have the extra control authority.

How much room do you think there is for improvement in simulation, and how could that help you in the future?

The tools that we have were definitely sufficient for doing the job that we needed to do in terms of building a helicopter that can fly on Mars. But simulation is a compute-intensive thing, and so I think there’s definitely room for higher fidelity simulation if you have the compute power to do so. For a future Mars helicopter, you could get some benefits by more closely coupling together high-fidelity aerodynamic models with larger multi-body models, and doing that in a fast way, where you can iterate quickly. There’s certainly more potential for optimizing things.

Photo: NASA/JPL-Caltech

Ingenuity preparing for flight.

Watching Ingenuity’s first flight take place will likely be much like watching the Perseverance landing—we’ll be able to follow along with the Ingenuity team while they send commands to the helicopter and receive data back, although the time delay will mean that any kind of direct control won’t be possible. If everything goes the way it’s supposed to, there will hopefully be some preliminary telemetry from Ingenuity saying so, but it sounds like we’ll likely have to wait until April 12 before we get pictures or video of the flight itself.

Because Mars doesn’t care what time it is on Earth, the flight will actually be taking place very early on April 12, with the JPL Mission Control livestream starting at 3:30 a.m. EDT (12:30 a.m. PDT). Details are here. Continue reading

Posted in Human Robots

#439087 In an AI world we need to teach students ...

Robots are writing more of what we read on the internet. And artificial intelligence (AI) writing tools are becoming freely available for anyone, including students, to use. Continue reading

Posted in Human Robots

#439081 Classify This Robot-Woven Sneaker With ...

For athletes trying to run fast, the right shoe can be essential to achieving peak performance. For athletes trying to run as fast as humanly possible, a runner’s shoe can also become a work of individually customized engineering.

This is why Adidas has married 3D printing with robotic automation in a mass-market footwear project it calls Futurecraft.Strung, expected to be available for purchase as early as later this year. Using a customized, 3D-printed sole, a Futurecraft.Strung manufacturing robot can place some 2,000 threads from up to 10 different sneaker yarns in one upper section of the shoe.

Skylar Tibbits, founder and co-director of the Self-Assembly Lab and associate professor in MIT's Department of Architecture, says that because of its small scale, footwear has been an area of focus for 3D printing and additive manufacturing, which involves adding material bit by bit.

“There are really interesting complex geometry problems,” he says. “It’s pretty well suited.”

Photo: Adidas

Beginning with a 3D-printed sole, Adidas robots weave together some 2,000 threads from up to 10 different sneaker yarns to make one Futurecraft.Strung shoe—expected on the marketplace later this year or sometime in 2022.

Adidas began working on the Futurecraft.Strung project in 2016. Then two years later, Adidas Futurecraft, the company’s innovation incubator, began collaborating with digital design studio Kram/Weisshaar. In less than a year the team built the software and hardware for the upper part of the shoe, called Strung uppers.

“Most 3D printing in the footwear space has been focused on the midsole or outsole, like the bottom of the shoe,” Tibbits explains. But now, he says, Adidas is bringing robotics and a threaded design to the upper part of the shoe. The company bases its Futurecraft.Strung design on high-resolution scans of how runners’ feet move as they travel.

This more flexible design can benefit athletes in multiple sports, according to an Adidas blog post. It will be able to use motion capture of an athlete’s foot, along with feedback from the athlete, to tailor the design to that athlete’s particular gait. Adidas customizes the weaving of the shoe’s “fabric” (really more like an elaborate woven string figure, a cat’s cradle to fit the foot) to achieve a close and comfortable fit, the company says.

What they call their “4D sole” consists of a design combining 3D printing with materials that can change their shape and properties over time. In fact, Tibbits coined the term 4D printing to describe this process in 2013. The company takes customized data from the Adidas Athlete Intelligent Engine to make the shoe, according to Kram/Weisshaar’s website.

Photo: Adidas

Closeup of the weaving process behind a Futurecraft.Strung shoe.

“With Strung for the first time, we can program single threads in any direction, where each thread has a different property or strength,” Fionn Corcoran-Tadd, an innovation designer at Adidas’ Futurecraft lab, said in a company video. Each thread serves a purpose, the video noted. “This is like customized string art for your feet,” Tibbits says.

Although the robotics technology the company uses has been around for many years, what Adidas’s robotic weavers can achieve with thread is a matter of elaborate geometry. “It’s more just like a really elegant way to build up material combining robotics and the fibers and yarns into these intricate and complex patterns,” he says.

Robots can of course create patterns with more precision than someone winding thread by hand, and they can rapidly and reliably change the yarn and color of the fabric pattern. Adidas says it can make a single upper in 45 minutes and a pair of sneakers in 1 hour and 30 minutes, and it plans to reduce that time to minutes in the months ahead.

An Adidas spokesperson says sneakers incorporating the Futurecraft.Strung upper design are still prototypes, but the company plans to bring a Strung shoe to market in late 2021 or 2022. However, Adidas Futurecraft sneakers are currently available with a 3D-printed midsole.
Adidas plans to continue gathering data from athletes to customize the uppers of sneakers. “We’re building up a library of knowledge and it will get more interesting as we aggregate data of testing and from different athletes and sports,” the Adidas Futurecraft team writes in a blog post. “The more we understand about how data can become design code, the more we can take that and apply it to new Strung textiles. It’s a continuous evolution.” Continue reading

Posted in Human Robots

#438073 Ball-shooting Robot

LeBron, look out – humanoids will soon beat you at shooting hoops! Well, maybe in the near future…

Posted in Human Robots

#439066 Video Friday: Festo’s BionicSwift

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Festo's Bionic Learning Network for 2021 presents a flock of BionicSwifts.

To execute the flight maneuvers as true to life as possible, the wings are modeled on the plumage of birds. The individual lamellae are made of an ultralight, flexible but very robust foam and lie on top of each other like shingles. Connected to a carbon quill, they are attached to the actual hand and arm wings as in the natural model.

During the wing upstroke, the individual lamellae fan out so that air can flow through the wing. This means that the birds need less force to pull the wing up. During the downstroke, the lamellae close up so that the birds can generate more power to fly. Due to this close-to-nature replica of the wings, the BionicSwifts have a better flight profile than previous wing-beating drives.

[ Festo ]

While we've seen a wide variety of COVID-motivated disinfecting robots, they're usually using either ultraviolet light or a chemical fog. This isn't the way that humans clean—we wipe stuff down, which gets rid of surface dirt and disinfects at the same time. Fraunhofer has been working on a mobile manipulator that can clean in the same ways that we do.

It's quite the technical challenge, but it has the potential to be both more efficient and more effective.

[ Fraunhofer ]

In recent years, robots have gained artificial vision, touch, and even smell. “Researchers have been giving robots human-like perception,” says MIT Associate Professor Fadel Adib. In a new paper, Adib’s team is pushing the technology a step further. “We’re trying to give robots superhuman perception,” he says. The researchers have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF-Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view.

[ MIT ]

Ingenuity is now scheduled to fly on April 11.

[ JPL ]

The legendary Zenta is back after a two-year YouTube hiatus with “a kind of freaky furry hexapod bunny creature.”

[ Zenta ]

It is with great pride and excitement that the South Australia Police announce a new expansion to their kennel by introducing three new Police Dog (PD) recruits. These dogs have been purposely targeted to bring a whole new range of dog operational capabilities known as the ‘small area urban search and guided evacuation’ dogs. Police have been working closely with specialist vets and dog trainers to ascertain if the lightweight dogs could be transported safely by drones and released into hard-to-access areas where at the moment the larger PDs just simply cannot get in due to their size.

[ SA Police ]

SoftBank may not have Spot cheerleading robots for their baseball team anymore, but they've more than made up for it with a full century of Peppers. And one dude doing the robot.

[ SoftBank ]

MAB Robotics is a Polish company developing walking robots for inspection, and here's a prototype they've been working on.

[ MAB Robotics ]

Thanks Jakub!

DoraNose: Smell your way to a better tomorrow.

[ Dorabot ]

Our robots need to learn how to cope with their new neighbors, and we have just the solution for this, the egg detector! Using cutting-edge AI, it provides incredible precision in detecting a vast variety of eggs. We have deployed this new feature on Boston Dynamics Spot, one of our fleet's robots. It can now detect eggs with its cameras and avoid them on its autonomous missions.

[ Energy Robotics ]

When dropping a squishy robot from an airplane 1,000 feet up, make sure that you land as close to people's cars as you can.

Now do it from orbit!

[ Squishy Robotics ]

An autonomous robot that is able to physically guide humans through narrow and cluttered spaces could be a big boon to the visually-impaired. Most prior robotic guiding systems are based on wheeled platforms with large bases with actuated rigid guiding canes. The large bases and the actuated arms limit these prior approaches from operating in narrow and cluttered environments. We propose a method that introduces a quadrupedal robot with a leash to enable the robot-guiding-human system to change its intrinsic dimension (by letting the leash go slack) in order to fit into narrow spaces.

[ Hybrid Robotics ]

How to prove that your drone is waterproof.

[ UNL ]

Well this ought to be pretty good once it gets out of simulation.

[ Hybrid Robotics ]

MIDAS is Aurora’s AI-enabled, multi-rotor sUAV outfitted with optical sensors and a customized payload that can defeat multiple small UAVs per flight with low-collateral effects.

[ Aurora ]

The robots of the DFKI have the advantage of being able to reach extreme environments: they can be used for decontamination purposes in high-risk areas, or to inspect and maintain underwater structures, which is why they are being tested in the North Sea near Heligoland.

[ DFKI ]

After years of trying, 60 Minutes cameras finally get a peek inside the workshop at Boston Dynamics, where robots move in ways once only thought possible in movies. Anderson Cooper reports.

[ 60 Minutes ]

In 2007, Noel Sharkey stated that “we are sleepwalking into a brave new world where robots decide who, where and when to kill.” Since then thousands of AI and robotics researchers have joined his calls to regulate “killer robots.” But sometime this year, Turkey will deploy fully autonomous home-built kamikaze drones on its border with Syria. What are the ethical choices we need to consider? Will we end up in an episode of Black Mirror? Or is the UN listening to calls and starting the process of regulating this space? Prof. Toby Walsh will discuss this important issue, consider where we are at and where we need to go.

[ ICRA 2020 ]

In the second session of HAI's spring conference, artists and technologists discussed how technology can enhance creativity, reimagine meaning, and support racial and social justice. The conference, called “Intelligence Augmentation: AI Empowering People to Solve Global Challenges,” took place on 25 March 2021.

[ Stanford HAI ]

This spring 2021 GRASP SFI comes from Monroe Kennedy III at Stanford University, on “Considerations for Human-Robot Collaboration.”

The field of robotics has evolved over the past few decades. We’ve seen robots progress from the automation of repetitive tasks in manufacturing to the autonomy of mobilizing in unstructured environments to the cooperation of swarm robots that are centralized or decentralized. These abilities have required advances in robotic hardware, modeling, and artificial intelligence. The next frontier is robots collaborating in complex tasks with human teammates, in environments traditionally configured for humans. While solutions to this challenge must utilize all the advances of robotics, the human element adds a unique aspect that must be addressed. Collaborating with a human teammate means that the robot must have a contextual understanding of the task as well as all participants’ roles. We will discuss what constitutes an effective teammate and how we can capture this behavior in a robotic collaborator.

[ UPenn ] Continue reading

Posted in Human Robots