Tag Archives: awesome

#439183 This Week’s Awesome Tech Stories From ...

ROBOTICS
The Robot Surgeon Will See You Now
Cade Metz | The New York Times
“Real scalpels, artificial intelligence—what could go wrong? …The [Berkeley] project is a part of a much wider effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots too. These methods are still a long way from everyday use, but progress is accelerating.”

FUTURE
This Tech Was Science Fiction 20 Years Ago. Now It’s Reality
Luke Dormehl | Digital Trends
“A couple of decades ago, kids were reading Harry Potter books, Pixar movies were all the rage, and Microsoft’s Xbox and Sony’s PlayStation were battling it out for video game supremacy. That doesn’t sound all that different from 2021. But technology has come a long way in that time. Not only is today’s tech far more powerful than it was 20 years ago, but a lot of the gadgets we thought of as science fiction have become part of our lives.”

LONGEVITY
How Long Can We Live?
Ferris Jabr | The New York Times Magazine
“As the global population approaches eight billion, and science discovers increasingly promising ways to slow or reverse aging in the lab, the question of human longevity’s potential limits is more urgent than ever. When their work is examined closely, it’s clear that longevity scientists hold a wide range of nuanced perspectives on the future of humanity.”

3D PRINTING
Forget Digging for Fossils. This Museum Printed a Full T-Rex Skeleton Instead
Luke Dormehl | Digital Trends
“For a team of researchers at the Naturalis Biodiversity Center in Leiden, the Netherlands, copying a T. rex took some state-of-the-art laser scanning technology, a giant 3D printer, a just-as-sizable postage bill, almost 45 million square millimeters of acrylic paint, and a group of experts wishing to push the boundaries of additive manufacturing.”

HEALTH
One Vaccine to Rule Them All
James Hamblin | The Atlantic
“‘A universal SARS-CoV-2 vaccine is step one,’ [Anthony] Fauci said. Step two would be a universal coronavirus vaccine, capable of protecting us not only from SARS-CoV-2 in all its forms, but also from the inevitable emergence of new and different coronaviruses that might cause future pandemics. The race to create such a vaccine may prove one of the great feats of a generation.”

TECHNOLOGY
These Materials Could Make Science Fiction a Reality
John Markoff | The New York Times
“Imagine operating a computer by moving your hands in the air as Tony Stark does in Iron Man. Or using a smartphone to magnify an object as does the device that Harrison Ford’s character uses in Blade Runner. …These advances and a host of others on the horizon could happen because of metamaterials, making it possible to control beams of light with the same ease that computer chips control electricity.”

DRONES
Wingcopter Debuts a Triple-Drop Drone to Create ‘Logistical Highways in the Sky’
Aria Alamalhodaei | TechCrunch
“The Wingcopter 198, which was revealed Tuesday, is capable of making three separate deliveries per flight, the company said. Wingcopter has couched this multi-stop capability as a critical feature that will allow it to grow a cost-efficient—and hopefully profitable—drone-delivery-as-a-service business.”

SPACE
The Asteroid Impact Simulation Has Ended in Disaster
George Dvorsky | Gizmodo
“An international exercise to simulate an asteroid striking Earth has come to an end. With just six days to go before a fictitious impact, things don’t look good for a 185-mile-wide region between Prague and Munich. …This may sound like a grim role-playing game, but it’s very serious business. Led by NASA’s Jet Propulsion Laboratory’s Center for Near Earth Object Studies, the asteroid impact simulation is meant to prepare scientists, planners, and key decision makers for the real thing, should it ever occur.”

Image Credit: mitsal dian / Unsplash

Posted in Human Robots

#439157 This Week’s Awesome Tech Stories From ...

COMPUTING
Now for AI’s Latest Trick: Writing Computer Code
Will Knight | Wired
“It can take years to learn how to write computer code well. SourceAI, a Paris startup, thinks programming shouldn’t be such a big deal. The company is fine-tuning a tool that uses artificial intelligence to write code based on a short text description of what the code should do. Tell the company’s tool to ‘multiply two numbers given by a user,’ for example, and it will whip up a dozen or so lines in Python to do just that.”
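
For scale, the task in that example is genuinely tiny; a hand-written version is only a few lines of Python. The snippet below is purely an illustration of the kind of program being described, not SourceAI's actual output:

```python
# Illustrative only: roughly the kind of snippet a text-to-code tool
# might produce for the prompt "multiply two numbers given by a user."
def multiply(a: float, b: float) -> float:
    return a * b

if __name__ == "__main__":
    x = float(input("Enter the first number: "))
    y = float(input("Enter the second number: "))
    print(f"{x} * {y} = {multiply(x, y)}")
```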

SPACE
NASA’s Perseverance Rover Just Turned CO2 Into Oxygen
Morgan McFall-Johnsen | Business Insider
“That’s good news for the prospect of sending human explorers to Mars. Oxygen takes up a lot of room on a spacecraft, and it’s unlikely that astronauts will be able to bring enough with them to Mars. So they’ll need to produce their own oxygen from the Martian atmosphere, both for breathing and for fueling rockets to return to Earth.”

ARTIFICIAL INTELLIGENCE
Latest Neural Nets Solve World’s Hardest Equations Faster Than Ever Before
Anil Ananthaswamy | Quanta
“…researchers have built new kinds of artificial neural networks that can approximate solutions to partial differential equations orders of magnitude faster than traditional PDE solvers. And once trained, the new neural nets can solve not just a single PDE but an entire family of them without retraining.”

SPACE
NASA’s Bold Bet on Starship for the Moon May Change Spaceflight Forever
Eric Berger | Ars Technica
“Until now, the plans NASA had contemplated for human exploration in deep space all had echoes of the Apollo program. …By betting on Starship, which entails a host of development risks, NASA is taking a chance on what would be a much brighter future. One in which not a handful of astronauts go to the Moon or Mars, but dozens and then hundreds. In this sense, Starship represents a radical departure for NASA and human exploration.”

AUTOMATION
Who Will Win the Self-Driving Race? Here Are Eight Possibilities
Timothy B. Lee | Ars Technica
“…predicting what the next couple of years will bring is a challenge. So rather than offering a single prediction, here are eight: I’ve broken down the future into eight possible scenarios, each with a rough probability. …A decade from now, we’ll be able to look back and say which companies or approaches were on the right track. For now, we can only guess.”

TECHNOLOGY
Europe’s Proposed Limits on AI Would Have Global Consequences
Will Knight | Wired
“The rules are the most significant international effort to regulate AI to date, covering facial recognition, autonomous driving, and the algorithms that drive online advertising, automated hiring, and credit scoring. The proposed rules could help shape global norms and regulations around a promising but contentious technology.”

SCIENCE
What Do You Call a Bunch of Black Holes: A Crush? A Scream?
Dennis Overbye | The New York Times
“[Astrophysicist Jocelyn Kelly Holley-Bockelmann] was trying to run a Zoom meeting of the [Laser Interferometer Space Antenna] recently ‘when one of the members said his daughter was wondering what you call a collective of black holes—and then the meeting fell apart, with everyone trying to up one another,’ she said in an email. ‘Each time I saw a suggestion, I had to stop and giggle like a loon, which egged us all on more.’”

ENVIRONMENT
Stopping Plastic in Rivers From Reaching the Ocean With New Tech From the Ocean Cleanup Project
Stephen Beacham | CNET
“First announced by Ocean Cleanup founder and CEO Boyan Slat in 2019, the Interceptors are moored to river beds and use the currents to snag debris floating on the surface. Then they direct the trash onto a conveyor belt that shuttles it into six large onboard dumpsters. The Interceptors run completely autonomously day and night, getting power from solar panels.”

FUTURE
Hackers Used to Be Humans. Soon, AIs Will Hack Humanity
Bruce Schneier | Wired
“Hacking is as old as humanity. We are creative problem solvers. We exploit loopholes, manipulate systems, and strive for more influence, power, and wealth. To date, hacking has exclusively been a human activity. Not for long. As I lay out in a report I just published, artificial intelligence will eventually find vulnerabilities in all sorts of social, economic, and political systems, and then exploit them at unprecedented speed, scale, and scope.”

Image Credit: NASA (Image of Martian sand dunes taken by NASA’s Curiosity rover)

Posted in Human Robots

#439132 This Week’s Awesome Tech Stories From ...

ARTIFICIAL INTELLIGENCE
15 Graphs You Need to See to Understand AI in 2021
Charles Q. Choi | IEEE Spectrum
“If you haven’t had time to read the AI Index Report for 2021, which clocks in at 222 pages, don’t worry—we’ve got you covered. The massive document, produced by the Stanford Institute for Human-Centered Artificial Intelligence, is packed full of data and graphs, and we’ve plucked out 15 that provide a snapshot of the current state of AI.”

FUTURE
Geoffrey Hinton Has a Hunch About What’s Next for Artificial Intelligence
Siobhan Roberts | MIT Technology Review
“Back in November, the computer scientist and cognitive psychologist Geoffrey Hinton had a hunch. After a half-century’s worth of attempts—some wildly successful—he’d arrived at another promising insight into how the brain works and how to replicate its circuitry in a computer.”

ROBOTICS
Robotic Exoskeletons Could One Day Walk by Themselves
Charles Q. Choi | IEEE Spectrum
“Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system’s analysis of a user’s current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says.”

TECHNOLOGY
Microsoft Buys AI Speech Tech Company Nuance for $19.7 Billion
James Vincent | The Verge
“The $19.7 billion acquisition of Nuance is Microsoft’s second-largest behind its purchase of LinkedIn in 2016 for $26 billion. It comes at a time when speech tech is improving rapidly, thanks to the deep learning boom in AI, and there are simultaneously more opportunities for its use.”

ENVIRONMENT
Google’s New 3D Time-Lapse Feature Shows How Humans Are Affecting the Planet
Sam Rutherford | Gizmodo
“Described by Google Earth director Rebecca Moore as the biggest update to Google Earth since 2017, Timelapse in Google Earth combines more than 24 million satellite photos, two petabytes of data, and 2 million hours of CPU processing time to create a 4.4-terapixel interactive view showing how the Earth has changed from 1984 to 2020.”

GENETICS
The Genetic Mistakes That Could Shape Our Species
Zaria Gorvett | BBC
“New technologies may have already introduced genetic errors to the human gene pool. How long will they last? And how could they affect us? …According to [Stanford’s Hank] Greely, who has written a book about the implications of He [Jiankui]’s project, the answer depends on what the edits do and how they’re inherited.”

SPACE
The Era of Reusability in Space Has Begun
Eric Berger | Ars Technica
“As [Earth orbit] becomes more cluttered [due to falling launch costs], the responsible thing is to more actively refuel, recycle, and dispose of satellites. Northrop Grumman has made meaningful progress toward such a future of satellite servicing. As a result, reusability is now moving into space.”

COMPUTING
100 Million More IoT Devices Are Exposed—and They Won’t Be the Last
Lily Hay Newman | Wired
“Over the last few years, researchers have found a shocking number of vulnerabilities in seemingly basic code that underpins how devices communicate with the internet. Now a new set of nine such vulnerabilities are exposing an estimated 100 million devices worldwide, including an array of internet-of-things products and IT management servers.”

Image Credit: Naitian (Tony) Wang / Unsplash

Posted in Human Robots

#439100 Video Friday: Robotic Eyeball Camera

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

What if seeing devices looked like us? Eyecam is a prototype exploring the potential future design of sensing devices. Eyecam is a webcam shaped like a human eye that can see, blink, look around and observe us.

And it's open source, so you can build your own!

[ Eyecam ]

Looks like Festo will be turning some of its bionic robots into educational kits, which is a pretty cool idea.

[ Bionics4Education ]

Underwater soft robots are challenging to model and control because of their high degrees of freedom and their intricate coupling with water. In this paper, we present a method that leverages the recent development in differentiable simulation coupled with a differentiable, analytical hydrodynamic model to assist with the modeling and control of an underwater soft robot. We apply this method to Starfish, a customized soft robot design that is easy to fabricate and intuitive to manipulate.

[ MIT CSAIL ]

Rainbow Robotics, the company that made HUBO, has a new collaborative robot arm.

[ Rainbow Robotics ]

Thanks Fan!

We develop an integrated robotic platform for advanced collaborative robots and demonstrate an application of multiple robots collaboratively transporting an object to different positions in a factory environment. The proposed platform integrates a drone, a mobile manipulator robot, and a dual-arm robot to work autonomously, while also collaborating with a human worker. The platform also demonstrates the potential of a novel manufacturing process, which incorporates adaptive and collaborative intelligence to improve the efficiency of mass customization for the factory of the future.

[ Paper ]

Thanks Poramate!

At Sevastopol State University, a team from the Laboratory of Underwater Robotics and Control Systems, together with the Research and Production Association “Android Technika,” tested an underwater anthropomorphic manipulator robot.

[ Sevastopol State ]

Thanks Fan!

Taiwanese company TCI Gene created a COVID test system based on its fully automated and enclosed gene testing machine QVS-96S. The system includes two ABB robots and carries out 1,800 tests per day, operating 24/7. Every hour, 96 virus sample tests are performed with an accuracy of 99.99%.

[ ABB ]

A short video showing how a Halodi Robotics robot can be used in a commercial guarding application.

[ Halodi ]

During the past five years, under the NASA Early Space Innovations program, we have been developing new design optimization methods for underactuated robot hands, aiming to achieve versatile manipulation in highly constrained environments. We have prototyped hands for NASA’s Astrobee robot, an in-orbit assistive free flyer for the International Space Station.

[ ROAM Lab ]

The new, improved OTTO 1500 is a workhorse AMR designed to move heavy payloads through demanding environments faster than any other AMR on the market, with zero compromise to safety.

[ OTTO Motors ]

Very, very high performance sensing and actuation to pull this off.

[ Ishikawa Group ]

We introduce a conversational social robot designed for long-term in-home use to help with loneliness. We present a novel robot behavior design to have simple self-reflection conversations with people to improve wellness, while still being feasible, deployable, and safe.

[ HCI Lab ]

We are one of the 5 winners of the Start-up Challenge. This video illustrates what we achieved during the Swisscom 5G exploration week. Our proof-of-concept tele-excavation system is composed of a Menzi Muck M545 walking excavator, automated and customized by the Robotic Systems Lab, and an IBEX motion platform serving as the operator station. The operator and remote machine are connected for the first time via a 5G network infrastructure, which was brought to our test field by Swisscom.

[ RSL ]

This video shows LOLA balancing on different terrain when being pushed in different directions. The robot is technically blind, not using any camera-based or prior information on the terrain (hard ground is assumed).

[ TUM ]

Autonomous driving when you cannot see the road at all because it's buried in snow is some serious autonomous driving.

[ Norlab ]

A hierarchical and robust framework for learning bipedal locomotion is presented and successfully implemented on the 3D biped robot Digit. The feasibility of the method is demonstrated by successfully transferring the learned policy in simulation to the Digit robot hardware, realizing sustained walking gaits under external force disturbances and challenging terrains not included during the training process.

[ OSU ]

This is a video summary of the Center for Robot-Assisted Search and Rescue's deployments under the direction of emergency response agencies to more than 30 disasters in five countries from 2001 (9/11 World Trade Center) to 2018 (Hurricane Michael). It includes the first use of ground robots for a disaster (WTC, 2001), the first use of small unmanned aerial systems (Hurricane Katrina, 2005), and the first use of water surface vehicles (Hurricane Wilma, 2005).

[ CRASAR ]

In March, a team from the Oxford Robotics Institute collected a week of epic off-road driving data, as part of the Sense-Assess-eXplain (SAX) project.

[ Oxford Robotics ]

As a part of the AAAI 2021 Spring Symposium Series, HEBI Robotics was invited to present an Industry Talk on the symposium's topic: Machine Learning for Mobile Robot Navigation in the Wild. Included in this presentation was a short case study on one of our upcoming mobile robots that is being designed to successfully navigate unstructured environments where today's robots struggle.

[ HEBI Robotics ]

Thanks Hardik!

This Lockheed Martin Robotics Seminar is from Chad Jenkins at the University of Michigan, on “Semantic Robot Programming… and Maybe Making the World a Better Place.”

I will present our efforts towards accessible and general methods of robot programming from the demonstrations of human users. Our recent work has focused on Semantic Robot Programming (SRP), a declarative paradigm for robot programming by demonstration that builds on semantic mapping. In contrast to procedural methods for motion imitation in configuration space, SRP is suited to generalize user demonstrations of goal scenes in workspace, such as for manipulation in cluttered environments. SRP extends our efforts to crowdsource robot learning from demonstration at scale through messaging protocols suited to web/cloud robotics. With such scaling of robotics in mind, prospects for cultivating both equal opportunity and technological excellence will be discussed in the context of broadening and strengthening Title IX and Title VI.

[ UMD ]

Posted in Human Robots

#439089 Ingenuity’s Chief Pilot Explains How ...

On April 11, the Mars helicopter Ingenuity will take to the skies of Mars for the first time. It will do so fully autonomously, out of necessity—the time delay between Ingenuity’s pilots at the Jet Propulsion Laboratory and Jezero Crater on Mars makes manual or even supervisory control impossible. So the best that the folks at JPL can do is practice as much as they can in simulation, and then hope that the helicopter can handle everything on its own.

Here on Earth, simulation is a critical tool for many robotics applications, because it doesn’t rely on access to expensive hardware, is non-destructive, and can be run in parallel and at faster-than-real-time speeds to focus on solving specific problems. Once you think you’ve gotten everything figured out in simulation, you can always give it a try on the real robot and see how close you came. If it works in real life, great! And if not, well, you can tweak some stuff in the simulation and try again.

For the Mars helicopter, simulation is much more important, and much higher stakes. Testing the Mars helicopter under conditions matching what it’ll find on Mars is not physically possible on Earth. JPL has flown engineering models in Martian atmospheric conditions, and they’ve used an actuated tether to mimic Mars gravity, but there’s just no way to know what it’ll be like flying on Mars until they’ve actually flown on Mars. With that in mind, the Ingenuity team has been relying heavily on simulation, since that’s one of the best tools they have to prepare for their Martian flights. We talk with Ingenuity’s Chief Pilot, Håvard Grip, to learn how it all works.

Ingenuity Facts:
Body Size: a box of tissues

Brains: Qualcomm Snapdragon 801

Weight: 1.8 kilograms

Propulsion: Two 1.2m carbon fiber rotors

Navigation sensors: VGA camera, laser altimeter, inclinometer

Ingenuity is scheduled to make its first flight no earlier than April 11. Before liftoff, the Ingenuity team will conduct a variety of pre-flight checks, including verifying the responsiveness of the control system and spinning the blades up to full speed (2,537 rpm) without lifting off. If everything looks good, the first flight will consist of a 1 meter per second climb to 3 meters, 30 seconds of hover at 3 meters while rotating in place a bit, and then a descent to landing. If Ingenuity pulls this off, its entire mission will be considered a success. There will be more flights over the next few weeks, but all it takes is one to prove that autonomous helicopter flight on Mars is possible.
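
Taken at face value, that profile works out to a flight lasting roughly half a minute. Here is a back-of-the-envelope sketch; the climb rate, altitude, and hover time come from the plan described above, while the descent rate is our own assumption for illustration:

```python
# Rough timing for the first-flight profile described above. Climb rate,
# altitude, and hover time are from the stated plan; the descent rate is
# an assumed value, used here only for illustration.
CLIMB_RATE_MPS = 1.0    # stated: climb at 1 m/s
HOVER_ALT_M = 3.0       # stated: hover at 3 m
HOVER_TIME_S = 30.0     # stated: 30 s of hover, turning in place
DESCENT_RATE_MPS = 1.0  # assumption for this sketch

climb_s = HOVER_ALT_M / CLIMB_RATE_MPS
descent_s = HOVER_ALT_M / DESCENT_RATE_MPS
total_s = climb_s + HOVER_TIME_S + descent_s
print(f"~{total_s:.0f} s total: {climb_s:.0f} s climb, "
      f"{HOVER_TIME_S:.0f} s hover, {descent_s:.0f} s descent")
```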

Last month, we spoke with Mars Helicopter Operations Lead Tim Canham about Ingenuity’s hardware, software, and autonomy, but we wanted to know more about how the Ingenuity team has been using simulation for everything from vehicle design to flight planning. To answer our questions, we talked with JPL’s Håvard Grip, who led the development of Ingenuity’s navigation and flight control systems. Grip also has the title of Ingenuity Chief Pilot, which is pretty awesome. He summarizes this role as “operating the flight control system to make the helicopter do what we want it to do.”

IEEE Spectrum: Can you tell me about the simulation environment that JPL uses for Ingenuity’s flight planning?

Håvard Grip: We developed a Mars helicopter simulation ourselves at JPL, based on a multi-body simulation framework that’s also developed at JPL, called DARTS/DSHELL. That's a system that has been in development at JPL for about 30 years now, and it's been used in a number of missions. And so we took that multibody simulation framework, and based on it we built our own Mars helicopter simulation, put together our own rotor model, our own aerodynamics models, and everything else that's needed in order to simulate a helicopter. We also had a lot of help from the rotorcraft experts at NASA Ames and NASA Langley.

Image: NASA/JPL

Ingenuity in JPL’s flight simulator.

Without being able to test on Mars, how much validation are you able to do of what you’re seeing in simulation?

We can do a fair amount, but it requires a lot of planning. When we made our first real prototype (with a full-size rotor that looked like what we were thinking of putting on Mars), we first spent a lot of time designing it and using simulation tools to guide that design, and when we were sufficiently confident that we were close enough, and that we understood enough about it, then we actually built the thing and designed a whole suite of tests in a vacuum chamber where we could replicate Mars atmospheric conditions. And those tests were before we tried to fly the helicopter—they were specifically targeted at what we call system identification, which has to do with figuring out what the true properties, the true dynamics of a system are, compared to what we assumed in our models. So then we got to see how well our models did, and in the places where they needed adjustment, we could go back and do that.

The simulation work that we really started after that very first initial lift test, that’s what allowed us to unlock all of the secrets to building a helicopter that can fly on Mars.
—Håvard Grip, Ingenuity Chief Pilot

We did a lot of this kind of testing. It was a big campaign, in several stages. But there are of course things that you can't fully replicate, and you do depend on simulation to tie things together. For example, we can't truly replicate Martian gravity on Earth. We can replicate the atmosphere, but not the gravity, and so we have to do various things when we fly—either make the helicopter very light, or we have to help it a little bit by pulling up on it with a string to offload some of the weight. These things don't fully replicate what it will be like on Mars. We also can't simultaneously replicate the Mars aerodynamic environment and the physical and visual surroundings that the helicopter will be flying in. These are places where simulation tools definitely come in handy, with the ability to do full flight tests from A to B, with the helicopter taking off from the ground, running the flight software that it will be running on board, simulating the images that the navigation camera takes of the ground below as it flies, feeding that back into the flight software, and then controlling it.

To what extent can simulation really compensate for the kinds of physical testing that you can’t do on Earth?

It gives you a few different possibilities. We can take certain tests on Earth where we replicate key elements of the environment, like the atmosphere or the visual surroundings for example, and you can validate your simulation on those parameters that you can test on Earth. Then, you can combine those things in simulation, which gives you the ability to set up arbitrary scenarios and do lots and lots of tests. We can Monte Carlo things, we can do a flight a thousand times in a row, with small perturbations of various parameters and tease out what our sensitivities are to those things. And those are the kinds of things that you can't do with physical tests, both because you can't fully replicate the environment and also because of the resources that would be required to do the same thing a thousand times in a row.
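
As a toy version of that workflow, the sketch below reruns a simple hover simulation a thousand times with small random perturbations to two parameters and reports the spread of outcomes. It is only a minimal illustration of the Monte Carlo idea; the dynamics, controller gains, and uncertainty levels are invented and have nothing to do with JPL's actual models:

```python
# Minimal Monte Carlo sketch: rerun a toy hover simulation with randomly
# perturbed parameters and look at the spread of outcomes. All numbers
# here are invented for illustration.
import random

def hover_error(thrust_scale, alt_bias, t_end=10.0, dt=0.02, target=3.0):
    """Final altitude error of a crude PD-controlled 1D hover."""
    alt, vel = 0.0, 0.0
    g = 3.71  # Mars surface gravity, m/s^2
    for _ in range(int(t_end / dt)):
        measured = alt + alt_bias                           # biased altimeter
        accel_cmd = 4.0 * (target - measured) - 3.0 * vel   # simple PD law
        accel = thrust_scale * (accel_cmd + g) - g          # perturbed thrust
        vel += accel * dt
        alt += vel * dt
    return abs(alt - target)

errors = []
for _ in range(1000):
    thrust_scale = random.gauss(1.0, 0.05)  # ~5% thrust/density uncertainty
    alt_bias = random.gauss(0.0, 0.05)      # ~5 cm altimeter bias (1-sigma)
    errors.append(hover_error(thrust_scale, alt_bias))

print(f"mean error {sum(errors)/len(errors):.3f} m, worst {max(errors):.3f} m")
```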

Because there are limits to the physical testing we can do on Earth, there are elements where we know there's more uncertainty. On those aspects where the uncertainty is high, we tried to build in enough margin that we can handle a range of things. And simulation gives you the ability to then maybe play with those parameters, and put them at their outer limits, and test them beyond where the real parameters are going to be to make sure that you have robustness even in those extreme cases.

How do you make sure you’re not relying on simulation too much, especially since in some ways it’s your only option?

It’s about anchoring it in real data, and we’ve done a lot of that with our physical testing. I think what you’re referring to is making your simulation too perfect, and we’re careful to model the things that matter. For example, the simulated sensors that we use have realistic levels of simulated noise and bias in them, the navigation camera images have realistic levels of degradation, we have realistic disturbances from wind gusts. If you don’t properly account for those things, then you’re missing important details. So, we try to be as accurate as we can, and to capture that by overbounding in areas where we have a high degree of uncertainty.
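
A minimal sketch of what "realistic noise and bias" can look like for a single simulated sensor follows; the specific numbers are arbitrary illustration values, not Ingenuity's actual sensor characteristics, and overbounding would simply mean setting them somewhat larger than what testing suggests:

```python
# Toy simulated altimeter: the true altitude corrupted by a constant bias
# plus zero-mean Gaussian noise. Bias and noise levels are arbitrary
# illustration values, not the real sensor's characteristics.
import random

class SimulatedAltimeter:
    def __init__(self, bias_m=0.03, noise_std_m=0.02):
        self.bias_m = bias_m            # constant offset (e.g. calibration error)
        self.noise_std_m = noise_std_m  # per-sample random error (1-sigma)

    def read(self, true_altitude_m: float) -> float:
        return true_altitude_m + self.bias_m + random.gauss(0.0, self.noise_std_m)

altimeter = SimulatedAltimeter()
for true_alt in (0.0, 1.0, 3.0):
    print(f"true {true_alt:.2f} m -> measured {altimeter.read(true_alt):.2f} m")
```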

What kinds of simulated challenges have you put the Mars helicopter through, and how do you decide how far to push those challenges?

One example is that we can simulate going over rougher terrain. We can push that, and see how far we can go and still have the helicopter behave the way that we want it to. Or we can inject levels of noise that maybe the real sensors don't see, but you want to just see how far you can push things and make sure that it's still robust.

Where we put the limits on this and what we consider to be realistic is often a challenge. We consider this on a case by case basis—if you have a sensor that you're dealing with, you try to do testing with it to characterize it and understand its performance as much as possible, and you build a level of confidence in it that allows you to find the proper balance.

When it comes to things like terrain roughness, it's a little bit of a different thing, because we're actually picking where we're flying the helicopter. We have made that choice, and we know what the terrain looks like around us, so we don’t have to wonder about that anymore.

Image: NASA/JPL-Caltech/University of Arizona

Satellite image of the Ingenuity flight area.

The way that we’re trying to approach this operationally is that we should be done with the engineering at this point. We’re not depending on going back and resimulating things, other than a few checks here and there.

Are there any examples of things you learned as part of the simulation process that resulted in changes to the hardware or mission?

You know, it’s been a journey. One of the early things that we discovered as part of modeling the helicopter was that the rotor dynamics were quite different for a helicopter on Mars, in particular with respect to how the rotor responds to the up and down bending of the blades because they’re not perfectly rigid. That motion is a very important influence on the overall flight dynamics of the helicopter, and what we discovered as we started modeling was that this motion is damped much less on Mars. Under-damped oscillatory things like that, you kind of figure might pose a control issue, and that is the case here: if you just naively design it as you might a helicopter on Earth, without taking this into account, you could have a system where the response to control inputs becomes very sluggish. So that required changes to the vehicle design from some of the very early concepts, and it led us to make a rotor that’s extremely light and rigid.
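
To see why less damping is a control problem, a toy second-order system makes the point: keep the stiffness the same but cut the damping, and a step command produces large overshoot and ringing instead of a crisp response. The numbers below are arbitrary and only illustrate the general effect, not the actual rotor flapping dynamics:

```python
# Toy illustration of how reduced damping degrades the step response of a
# second-order system. Values are arbitrary and unrelated to the real rotor.
def step_overshoot(zeta, wn=10.0, t_end=3.0, dt=0.001):
    """Percent overshoot of x'' + 2*zeta*wn*x' + wn^2*x = wn^2 for a unit step."""
    x, v, peak = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = wn * wn * (1.0 - x) - 2.0 * zeta * wn * v
        v += a * dt
        x += v * dt
        peak = max(peak, x)
    return 100.0 * (peak - 1.0)

for zeta in (0.7, 0.1):  # well-damped vs. lightly damped
    print(f"damping ratio {zeta}: overshoot ~{step_overshoot(zeta):.0f}%")
```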

The design cycle for the Mars helicopter—it’s not like we could just build something and take it out to the back yard and try it and then come back and tweak it if it doesn’t work. It’s a much bigger effort to build something and develop a test program where you have to use a vacuum chamber to test it. So you really want to get as close as possible up front, on your first iteration, and not have to go back to the drawing board on the basic things.

So how close were you able to get on your first iteration of the helicopter design?

[This video shows] a very early demo which was done more or less just assuming that things were going to behave as they would on Earth, and that we’d be able to fly in a Martian atmosphere just spinning the rotor faster and having a very light helicopter. We were basically just trying to demonstrate that we could produce enough lift. You can see the helicopter hopping around, with someone trying to joystick it, but it turned out to be very hard to control. This was prior to doing any of the modeling that I talked about earlier. But once we started seriously focusing on the modeling and simulation, we then went on to build a prototype vehicle which had a full-size rotor that’s very close to the rotor that will be flying on Mars. One difference is that prototype had cyclic control only on the lower rotor, and later we added cyclic control on the upper rotor as well, and that decision was informed in large part by the work we did in simulation—we’d put in the kinds of disturbances that we thought we might see on Mars, and decided that we needed to have the extra control authority.

How much room do you think there is for improvement in simulation, and how could that help you in the future?

The tools that we have were definitely sufficient for doing the job that we needed to do in terms of building a helicopter that can fly on Mars. But simulation is a compute-intensive thing, and so I think there’s definitely room for higher fidelity simulation if you have the compute power to do so. For a future Mars helicopter, you could get some benefits by more closely coupling together high-fidelity aerodynamic models with larger multi-body models, and doing that in a fast way, where you can iterate quickly. There’s certainly more potential for optimizing things.

Photo: NASA/JPL-Caltech

Ingenuity preparing for flight.

Watching Ingenuity’s first flight take place will likely be much like watching the Perseverance landing—we’ll be able to follow along with the Ingenuity team while they send commands to the helicopter and receive data back, although the time delay will mean that any kind of direct control won’t be possible. If everything goes the way it’s supposed to, there will hopefully be some preliminary telemetry from Ingenuity saying so, but it sounds like we’ll likely have to wait until April 12 before we get pictures or video of the flight itself.

Because Mars doesn’t care what time it is on Earth, the flight will actually be taking place very early on April 12, with the JPL Mission Control livestream starting at 3:30 a.m. EDT (12:30 a.m. PDT). Details are here.

Posted in Human Robots