Tag Archives: lift

#439089 Ingenuity’s Chief Pilot Explains How ...

On April 11, the Mars helicopter Ingenuity will take to the skies of Mars for the first time. It will do so fully autonomously, out of necessity—the time delay between Ingenuity’s pilots at the Jet Propulsion Laboratory and Jezero Crater on Mars makes manual or even supervisory control impossible. So the best that the folks at JPL can do is practice as much as they can in simulation, and then hope that the helicopter can handle everything on its own.

Here on Earth, simulation is a critical tool for many robotics applications, because it doesn’t rely on access to expensive hardware, is non-destructive, and can be run in parallel and at faster-than-real-time speeds to focus on solving specific problems. Once you think you’ve gotten everything figured out in simulation, you can always give it a try on the real robot and see how close you came. If it works in real life, great! And if not, well, you can tweak some stuff in the simulation and try again.

For the Mars helicopter, simulation is much more important, and much higher stakes. Testing the Mars helicopter under conditions matching what it’ll find on Mars is not physically possible on Earth. JPL has flown engineering models in Martian atmospheric conditions, and they’ve used an actuated tether to mimic Mars gravity, but there’s just no way to know what it’ll be like flying on Mars until they’ve actually flown on Mars. With that in mind, the Ingenuity team has been relying heavily on simulation, since that’s one of the best tools they have to prepare for their Martian flights. We talk with Ingenuity’s Chief Pilot, Håvard Grip, to learn how it all works.

Ingenuity Facts:
Body Size: a box of tissues

Brains: Qualcomm Snapdragon 801

Weight: 1.8 kilograms

Propulsion: Two 1.2m carbon fiber rotors

Navigation sensors: VGA camera, laser altimeter, inclinometer

Ingenuity is scheduled to make its first flight no earlier than April 11. Before liftoff, the Ingenuity team will conduct a variety of pre-flight checks, including verifying the responsiveness of the control system and spinning the blades up to full speed (2,537 rpm) without lifting off. If everything looks good, the first flight will consist of a 1 meter per second climb to 3 meters, 30 seconds of hover at 3 meters while rotating in place a bit, and then a descent to landing. If Ingenuity pulls this off, its entire mission will be considered a success. There will be more flights over the next few weeks, but all it takes is one to prove that autonomous helicopter flight on Mars is possible.
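That profile is simple enough to write down as a handful of timed setpoints. Here is a rough sketch of how such a plan might be represented; the climb rate, hover time, and rotor speed come from the description above, while the spin-up duration, yaw rate, and data layout are our own illustrative assumptions, not JPL's flight software.

```python
from dataclasses import dataclass

@dataclass
class Phase:
    name: str               # human-readable label
    duration_s: float       # how long the phase lasts
    climb_rate_mps: float   # vertical velocity setpoint (+ up)
    yaw_rate_dps: float     # yaw rate while holding position

# First-flight profile as described in the text; the spin-up duration and
# yaw rate are illustrative guesses, the rest comes from the article.
FIRST_FLIGHT = [
    Phase("spin_up", 60.0,  0.0, 0.0),   # blades to 2,537 rpm, no liftoff
    Phase("climb",    3.0,  1.0, 0.0),   # 1 m/s climb to ~3 m
    Phase("hover",   30.0,  0.0, 6.0),   # hold altitude, rotate in place
    Phase("descend",  3.0, -1.0, 0.0),   # come back down and land
]

def altitude_profile(plan, dt=0.5):
    """Integrate the climb-rate setpoints into a rough altitude-vs-time trace."""
    t, alt, trace = 0.0, 0.0, []
    for phase in plan:
        for _ in range(int(phase.duration_s / dt)):
            alt = max(0.0, alt + phase.climb_rate_mps * dt)
            t += dt
            trace.append((t, phase.name, alt))
    return trace

if __name__ == "__main__":
    for t, name, alt in altitude_profile(FIRST_FLIGHT)[::12]:
        print(f"t={t:6.1f} s  phase={name:8s}  alt={alt:4.1f} m")
```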

Last month, we spoke with Mars Helicopter Operations Lead Tim Canham about Ingenuity’s hardware, software, and autonomy, but we wanted to know more about how the Ingenuity team has been using simulation for everything from vehicle design to flight planning. To answer our questions, we talked with JPL’s Håvard Grip, who led the development of Ingenuity’s navigation and flight control systems. Grip also has the title of Ingenuity Chief Pilot, which is pretty awesome. He summarizes this role as “operating the flight control system to make the helicopter do what we want it to do.”

IEEE Spectrum: Can you tell me about the simulation environment that JPL uses for Ingenuity’s flight planning?

Håvard Grip: We developed a Mars helicopter simulation ourselves at JPL, based on a multi-body simulation framework that’s also developed at JPL, called DARTS/DSHELL. That's a system that has been in development at JPL for about 30 years now, and it's been used in a number of missions. And so we took that multibody simulation framework, and based on it we built our own Mars helicopter simulation, put together our own rotor model, our own aerodynamics models, and everything else that's needed in order to simulate a helicopter. We also had a lot of help from the rotorcraft experts at NASA Ames and NASA Langley.

Image: NASA/JPL

Ingenuity in JPL’s flight simulator.

Without being able to test on Mars, how much validation are you able to do of what you’re seeing in simulation?

We can do a fair amount, but it requires a lot of planning. When we made our first real prototype (with a full-size rotor that looked like what we were thinking of putting on Mars) we first spent a lot of time designing it and using simulation tools to guide that design, and when we were sufficiently confident that we were close enough, and that we understood enough about it, then we actually built the thing and designed a whole suite of tests in a vacuum chamber where we could replicate Mars atmospheric conditions. And those tests were before we tried to fly the helicopter—they were specifically targeted at what we call system identification, which has to do with figuring out what the true properties, the true dynamics of a system are, compared to what we assumed in our models. So then we got to see how well our models did, and in the places where they needed adjustment, we could go back and do that.

The simulation work that we really started after that very first initial lift test, that’s what allowed us to unlock all of the secrets to building a helicopter that can fly on Mars.
—Håvard Grip, Ingenuity Chief Pilot

We did a lot of this kind of testing. It was a big campaign, in several stages. But there are of course things that you can't fully replicate, and you do depend on simulation to tie things together. For example, we can't truly replicate Martian gravity on Earth. We can replicate the atmosphere, but not the gravity, and so we have to do various things when we fly—either make the helicopter very light, or we have to help it a little bit by pulling up on it with a string to offload some of the weight. These things don't fully replicate what it will be like on Mars. We also can't simultaneously replicate the Mars aerodynamic environment and the physical and visual surroundings that the helicopter will be flying in. These are places where simulation tools definitely come in handy, with the ability to do full flight tests from A to B, with the helicopter taking off from the ground, running the flight software that it will be running on board, simulating the images that the navigation camera takes of the ground below as it flies, feeding that back into the flight software, and then controlling it.
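That last part describes a classic closed-loop, software-in-the-loop setup: propagate the vehicle dynamics, synthesize what the navigation camera would see, run the flight software on that synthetic data, and feed the resulting commands back into the dynamics. Here is a deliberately toy sketch of the loop's shape; the dynamics model, the "camera," and the controller below are stand-ins of our own invention, not JPL's models.

```python
import random

DT = 0.02        # simulation step in seconds (illustrative)
G_MARS = 3.71    # Mars surface gravity, m/s^2

def step_dynamics(state, acc_cmd):
    """Toy vertical dynamics standing in for the multi-body simulation."""
    alt, vel = state
    vel += (acc_cmd - G_MARS) * DT
    alt = max(0.0, alt + vel * DT)
    return alt, vel

def render_camera(state):
    """Stand-in for the simulated navigation-camera image: here it just yields
    a noisy altitude observation, as if derived from features on the ground."""
    return state[0] + random.gauss(0.0, 0.01)

class FlightSoftware:
    """Stand-in for the onboard estimator + controller (filtered PD on altitude)."""
    def __init__(self, target_alt):
        self.target = target_alt
        self.est_alt = 0.0
        self.est_rate = 0.0

    def update(self, measurement):
        # crude smoothing filter in place of the real navigation filter
        new_est = 0.7 * self.est_alt + 0.3 * measurement
        self.est_rate = 0.8 * self.est_rate + 0.2 * (new_est - self.est_alt) / DT
        self.est_alt = new_est
        err = self.target - self.est_alt
        return max(0.0, G_MARS + 1.5 * err - 1.2 * self.est_rate)  # accel command

if __name__ == "__main__":
    state, fsw = (0.0, 0.0), FlightSoftware(target_alt=3.0)
    for k in range(int(15.0 / DT)):        # 15 simulated seconds
        meas = render_camera(state)         # synthetic sensor data
        cmd = fsw.update(meas)              # flight software closes the loop
        state = step_dynamics(state, cmd)   # commands act back on the dynamics
        if k % 100 == 0:
            print(f"t={k * DT:5.1f} s  alt={state[0]:5.2f} m")
```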

To what extent can simulation really compensate for the kinds of physical testing that you can’t do on Earth?

It gives you a few different possibilities. We can take certain tests on Earth where we replicate key elements of the environment, like the atmosphere or the visual surroundings for example, and you can validate your simulation on those parameters that you can test on Earth. Then, you can combine those things in simulation, which gives you the ability to set up arbitrary scenarios and do lots and lots of tests. We can Monte Carlo things, we can do a flight a thousand times in a row, with small perturbations of various parameters and tease out what our sensitivities are to those things. And those are the kinds of things that you can't do with physical tests, both because you can't fully replicate the environment and also because of the resources that would be required to do the same thing a thousand times in a row.

Because there are limits to the physical testing we can do on Earth, there are elements where we know there's more uncertainty. On those aspects where the uncertainty is high, we tried to build in enough margin that we can handle a range of things. And simulation gives you the ability to then maybe play with those parameters, and put them at their outer limits, and test them beyond where the real parameters are going to be to make sure that you have robustness even in those extreme cases.
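Both ideas, random perturbation runs and deliberately parking parameters at their outer limits, boil down to running the same flight simulation many times over different parameter sets and looking at the spread of outcomes. Here is a sketch of the bookkeeping only: the run_flight placeholder, the parameter names and spreads, and the 1.5 m "requirement" are all invented for illustration and have nothing to do with Ingenuity's actual numbers.

```python
import random
import statistics
from itertools import product

# Nominal values and assumed uncertainty ranges (all numbers illustrative).
NOMINAL = {"air_density": 0.017, "wind_gust_mps": 0.0, "sensor_bias_deg": 0.0}
SPREAD  = {"air_density": 0.002, "wind_gust_mps": 2.0, "sensor_bias_deg": 0.2}
REQUIREMENT_M = 1.5   # invented landing-offset requirement, meters

def run_flight(params):
    """Placeholder for a full closed-loop flight simulation; returns a signed
    downrange landing offset in meters (0 would be a perfect landing)."""
    return (0.15 * params["wind_gust_mps"]
            + 0.5 * params["sensor_bias_deg"]
            + 20.0 * (params["air_density"] - 0.017)
            + random.gauss(0.0, 0.02))

def pearson(xs, ys):
    """Correlation of a parameter with the outcome: a crude sensitivity measure."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

def monte_carlo(n_runs=1000):
    """Fly the same mission n_runs times with randomly perturbed parameters."""
    draws = [{k: random.gauss(NOMINAL[k], SPREAD[k]) for k in NOMINAL}
             for _ in range(n_runs)]
    return draws, [run_flight(d) for d in draws]

def corner_sweep(n_sigma=3.0):
    """Park every parameter at +/- n_sigma, well beyond its expected range."""
    worst = (None, -1.0)
    for signs in product((-1.0, 1.0), repeat=len(NOMINAL)):
        params = {k: NOMINAL[k] + s * n_sigma * SPREAD[k]
                  for k, s in zip(NOMINAL, signs)}
        err = abs(run_flight(params))
        if err > worst[1]:
            worst = (params, err)
    return worst

if __name__ == "__main__":
    draws, offsets = monte_carlo()
    abs_errs = sorted(abs(o) for o in offsets)
    print(f"mean |offset| {statistics.mean(abs_errs):.2f} m, "
          f"95th percentile {abs_errs[int(0.95 * len(abs_errs))]:.2f} m")
    for k in NOMINAL:
        r = pearson([d[k] for d in draws], offsets)
        print(f"  sensitivity of offset to {k}: r = {r:+.2f}")
    worst_params, worst_err = corner_sweep()
    verdict = "meets" if worst_err <= REQUIREMENT_M else "violates"
    print(f"worst corner |offset| {worst_err:.2f} m ({verdict} the "
          f"{REQUIREMENT_M} m requirement)")
```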

How do you make sure you’re not relying on simulation too much, especially since in some ways it’s your only option?

It’s about anchoring it in real data, and we’ve done a lot of that with our physical testing. I think what you’re referring to is making your simulation too perfect, and we’re careful to model the things that matter. For example, the simulated sensors that we use have realistic levels of simulated noise and bias in them, the navigation camera images have realistic levels of degradation, we have realistic disturbances from wind gusts. If you don’t properly account for those things, then you’re missing important details. So, we try to be as accurate as we can, and to capture that by overbounding in areas where we have a high degree of uncertainty.
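In practice, that means the simulated sensors wrap the ground truth in bias, noise, and occasional dropouts before the flight software ever sees it. Below is a bare-bones sketch of that kind of degradation model; the noise figures and sensor names are made up for illustration, whereas the real levels come out of the hardware characterization testing described earlier.

```python
import math
import random

class DegradedSensor:
    """Wraps a true signal in a fixed bias, Gaussian noise, and occasional
    dropouts, so the flight software never sees ground truth directly."""
    def __init__(self, bias, noise_std, dropout_prob=0.0):
        self.bias = bias
        self.noise_std = noise_std
        self.dropout_prob = dropout_prob

    def measure(self, true_value):
        if random.random() < self.dropout_prob:
            return None                           # missed or rejected measurement
        return true_value + self.bias + random.gauss(0.0, self.noise_std)

# Illustrative instances only; none of these numbers are Ingenuity's specs.
altimeter       = DegradedSensor(bias=0.02, noise_std=0.03)                     # meters
inclinometer    = DegradedSensor(bias=0.10, noise_std=0.05)                     # degrees
feature_tracker = DegradedSensor(bias=0.0,  noise_std=1.5, dropout_prob=0.05)   # pixels

def wind_gust(t, mean=2.0, gust_amp=3.0):
    """Crude wind disturbance: a steady component plus a slowly varying gust."""
    return mean + gust_amp * math.sin(0.4 * t) + random.gauss(0.0, 0.5)

if __name__ == "__main__":
    for t in range(5):
        px = feature_tracker.measure(120.0)
        px_str = "dropout" if px is None else f"{px:.1f}"
        print(f"t={t} s  alt={altimeter.measure(3.0):.3f} m"
              f"  tilt={inclinometer.measure(0.0):+.3f} deg"
              f"  feature_x={px_str} px  wind={wind_gust(t):.2f} m/s")
```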

What kinds of simulated challenges have you put the Mars helicopter through, and how do you decide how far to push those challenges?

One example is that we can simulate going over rougher terrain. We can push that, and see how far we can go and still have the helicopter behave the way that we want it to. Or we can inject levels of noise that maybe the real sensors don't see, but you want to just see how far you can push things and make sure that it's still robust.

Where we put the limits on this and what we consider to be realistic is often a challenge. We consider this on a case by case basis—if you have a sensor that you're dealing with, you try to do testing with it to characterize it and understand its performance as much as possible, and you build a level of confidence in it that allows you to find the proper balance.

When it comes to things like terrain roughness, it's a little bit of a different thing, because we're actually picking where we're flying the helicopter. We have made that choice, and we know what the terrain looks like around us, so we don’t have to wonder about that anymore.

Image: NASA/JPL-Caltech/University of Arizona

Satellite image of the Ingenuity flight area.

The way that we’re trying to approach this operationally is that we should be done with the engineering at this point. We’re not depending on going back and resimulating things, other than a few checks here and there.

Are there any examples of things you learned as part of the simulation process that resulted in changes to the hardware or mission?

You know, it’s been a journey. One of the early things that we discovered as part of modeling the helicopter was that the rotor dynamics were quite different for a helicopter on Mars, in particular with respect to how the rotor responds to the up and down bending of the blades because they’re not perfectly rigid. That motion is a very important influence on the overall flight dynamics of the helicopter, and what we discovered as we started modeling was that this motion is damped much less on Mars. Under-damped oscillatory things like that, you kind of figure might pose a control issue, and that is the case here: if you just naively design it as you might a helicopter on Earth, without taking this into account, you could have a system where the response to control inputs becomes very sluggish. So that required changes to the vehicle design from some of the very early concepts, and it led us to make a rotor that’s extremely light and rigid.

The design cycle for the Mars helicopter—it’s not like we could just build something and take it out to the back yard and try it and then come back and tweak it if it doesn’t work. It’s a much bigger effort to build something and develop a test program where you have to use a vacuum chamber to test it. So you really want to get as close as possible up front, on your first iteration, and not have to go back to the drawing board on the basic things.
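The flap motion Grip describes behaves, to first order, like a second-order oscillator, and much of the damping on that mode comes from the air the blades move through; in an atmosphere roughly one percent as dense as Earth's, most of that damping disappears. The following is a generic mass-spring-damper comparison, not JPL's rotor model, and the two damping ratios are picked purely to illustrate the trend: a lightly damped mode overshoots more and rings far longer, which is exactly the kind of behavior a controller has to be designed around.

```python
def step_response(zeta, omega_n=10.0, dt=0.001, t_end=10.0):
    """Unit-step response of x'' + 2*zeta*omega_n*x' + omega_n^2*x = omega_n^2,
    integrated with a simple semi-implicit Euler scheme."""
    x, v, trace = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        a = omega_n ** 2 * (1.0 - x) - 2.0 * zeta * omega_n * v
        v += a * dt
        x += v * dt
        trace.append(x)
    return trace

def overshoot(trace):
    return max(trace) - 1.0

def settling_time(trace, dt, tol=0.05):
    """Time after which the response stays within +/- tol of the final value."""
    for k in range(len(trace) - 1, -1, -1):
        if abs(trace[k] - 1.0) > tol:
            return (k + 1) * dt
    return 0.0

if __name__ == "__main__":
    cases = [("well damped (Earth-like)", 0.5), ("lightly damped (Mars-like)", 0.05)]
    for label, zeta in cases:
        tr = step_response(zeta)
        print(f"{label:28s} zeta={zeta:4.2f}"
              f"  overshoot={100 * overshoot(tr):5.1f}%"
              f"  settling time={settling_time(tr, 0.001):4.1f} s")
```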

So how close were you able to get on your first iteration of the helicopter design?

[This video shows] a very early demo which was done more or less just assuming that things were going to behave as they would on Earth, and that we’d be able to fly in a Martian atmosphere just spinning the rotor faster and having a very light helicopter. We were basically just trying to demonstrate that we could produce enough lift. You can see the helicopter hopping around, with someone trying to joystick it, but it turned out to be very hard to control. This was prior to doing any of the modeling that I talked about earlier. But once we started seriously focusing on the modeling and simulation, we then went on to build a prototype vehicle which had a full-size rotor that’s very close to the rotor that will be flying on Mars. One difference is that prototype had cyclic control only on the lower rotor, and later we added cyclic control on the upper rotor as well, and that decision was informed in large part by the work we did in simulation—we’d put in the kinds of disturbances that we thought we might see on Mars, and decided that we needed to have the extra control authority.

How much room do you think there is for improvement in simulation, and how could that help you in the future?

The tools that we have were definitely sufficient for doing the job that we needed to do in terms of building a helicopter that can fly on Mars. But simulation is a compute-intensive thing, and so I think there’s definitely room for higher fidelity simulation if you have the compute power to do so. For a future Mars helicopter, you could get some benefits by more closely coupling together high-fidelity aerodynamic models with larger multi-body models, and doing that in a fast way, where you can iterate quickly. There’s certainly more potential for optimizing things.

Photo: NASA/JPL-Caltech

Ingenuity preparing for flight.

Watching Ingenuity’s first flight take place will likely be much like watching the Perseverance landing—we’ll be able to follow along with the Ingenuity team while they send commands to the helicopter and receive data back, although the time delay will mean that any kind of direct control won’t be possible. If everything goes the way it’s supposed to, there will hopefully be some preliminary telemetry from Ingenuity saying so, but it sounds like we’ll likely have to wait until April 12 before we get pictures or video of the flight itself.

Because Mars doesn’t care what time it is on Earth, the flight will actually be taking place very early on April 12, with the JPL Mission Control livestream starting at 3:30 a.m. EDT (12:30 a.m. PDT). Details are here.


#439036 Video Friday: Shadow Plays Jenga, and ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The Shadow Robot team couldn't resist! Our Operator, Joanna, is using the Shadow Teleoperation System which, fun and games aside, can help those in difficult, dangerous and distant jobs.

Shadow could challenge this MIT Jenga-playing robot, but I bet they wouldn't win:

[ Shadow Robot ]

Digit is gradually stomping the Agility Robotics logo into a big grassy field fully autonomously.

[ Agility Robotics ]

This is a pretty great and very short robotic magic show.

[ Mario the Magician ]

A research team at the Georgia Institute of Technology has developed a modular solution for drone delivery of larger packages without the need for a complex fleet of drones of varying sizes. By allowing teams of small drones to collaboratively lift objects using an adaptive control algorithm, the strategy could allow a wide range of packages to be delivered using a combination of several standard-sized vehicles.

[ GA Tech ]

I've seen this done using vision before, but Flexiv's Rizon 4s can keep a ball moving along a specific trajectory using only force sensing and control.

[ Flexiv ]

Thanks Yunfan!

This combination of a 3D aerial projection system and a sensing interface can be used as an interactive and intuitive control system for things like robot arms, but in this case, it's being used to make simulated pottery. Much less messy than the traditional way of doing it.

More details on Takafumi Matsumaru's work at the Bio-Robotics & Human-Mechatronics Laboratory at Waseda University at the link below.

[ BLHM ]

U.S. Vice President Kamala Harris called astronauts Shannon Walker and Kate Rubins on the ISS, and they brought up Astrobee, at which point Shannon reached over and ripped Honey right off of her charging dock to get her on camera.

[ NASA ]

Here's a quick three minute update on Perseverance and Ingenuity from JPL.

[ Mars 2020 ]

Rigid grippers used in existing aerial manipulators require precise positioning to achieve successful grasps and transmit large contact forces that may destabilize the drone. This limits the speed during grasping and prevents “dynamic grasping,” where the drone attempts to grasp an object while moving. On the other hand, biological systems (e.g. birds) rely on compliant and soft parts to dampen contact forces and compensate for grasping inaccuracy, enabling impressive feats. This paper presents the first prototype of a soft drone—a quadrotor where traditional (i.e. rigid) landing gears are replaced with a soft tendon-actuated gripper to enable aggressive grasping.

[ MIT ]

In this video we present results from a field deployment inside the Løkken Mine underground pyrite mine in Norway. The Løkken mine was operative from 1654 to 1987 and contains narrow but long corridors, alongside vast rooms and challenging vertical stopes. In this field study we evaluated selected autonomous exploration and visual search capabilities of a subset of the aerial robots of Team CERBERUS towards the goal of complete subterranean autonomy.

[ Team CERBERUS ]

What you can do with a 1,000 FPS projector with a high speed tracking system.

[ Ishikawa Group ]

ANYbotics’ collaboration with BASF, one of the largest global chemical manufacturers, displays the efficiency, quality, and scalability of robotic inspection and data-collection capabilities in complex industrial environments.

[ ANYbotics ]

Does your robot arm need a stylish jacket?

[ Fraunhofer ]

Trossen Robotics unboxes a Unitree A1, and it's actually an unboxing where they have to figure out everything from scratch.

[ Trossen ]

Robots have learned to drive cars, assist in surgeries―and vacuum our floors. But can they navigate the unwritten rules of a busy sidewalk? Until they can, robotics experts Leila Takayama and Chris Nicholson believe, robots won’t be able to fulfill their immense potential. In this conversation, Chris and Leila explore the future of robotics and the role open source will play in it.

[ Red Hat ]

Christoph Bartneck's keynote at the 6th Joint UAE Symposium on Social Robotics, focusing on what roles robots can play during the Covid crisis and why so many social robots fail in the market.

[ HIT Lab ]

Decision-making based on arbitrary criteria is legal in some contexts, such as employment, and not in others, such as criminal sentencing. As algorithms replace human deciders, HAI-EIS fellow Kathleen Creel argues arbitrariness at scale is morally and legally problematic. In this HAI seminar, she explains how the heart of this moral issue relates to domination and a lack of sufficient opportunity for autonomy. It relates in interesting ways to the moral wrong of discrimination. She proposes technically informed solutions that can lessen the impact of algorithms at scale and so mitigate or avoid the moral harm identified.

[ Stanford HAI ]

Sawyer B. Fuller speaks on Autonomous Insect-Sized Robots at the UC Berkeley EECS Colloquium series.

Sub-gram (insect-sized) robots have enormous potential that is largely untapped. From a research perspective, their extreme size, weight, and power (SWaP) constraints also force us to reimagine everything from how they compute their control laws to how they are fabricated. These questions are the focus of the Autonomous Insect Robotics Laboratory at the University of Washington. I will discuss potential applications for insect robots and recent advances from our group. These include the first wireless flights of a sub-gram flapping-wing robot that weighs barely more than a toothpick. I will describe efforts to expand its capabilities, including the first multimodal ground-flight locomotion, the first demonstration of steering control, and how to find chemical plume sources by integrating the smelling apparatus of a live moth. I will also describe a backpack for live beetles with a steerable camera and conceptual design of robots that could scale all the way down to the “gnat robots” first envisioned by Flynn & Brooks in the ‘80s.

[ UC Berkeley ]

Thanks Fan!

Joshua Vander Hook, Computer Scientist, NIAC Fellow, and Technical Group Supervisor at NASA JPL, presents an overview of the AI Group(s) at JPL, and recent work on single and multi-agent autonomous systems supporting space exploration, Earth science, NASA technology development, and national defense programs.

[ UMD ]


#439023 In ‘Klara and the Sun,’ We Glimpse ...

In a store in the center of an unnamed city, humanoid robots are displayed alongside housewares and magazines. They watch the fast-moving world outside the window, anxiously awaiting the arrival of customers who might buy them and take them home. Among them is Klara, a particularly astute robot who loves the sun and wants to learn as much as possible about humans and the world they live in.

So begins Kazuo Ishiguro’s new novel Klara and the Sun, published earlier this month. The book, told from Klara’s perspective, portrays an eerie future society in which intelligent machines and other advanced technologies have been integrated into daily life, but not everyone is happy about it.

Technological unemployment, the progress of artificial intelligence, inequality, the safety and ethics of gene editing, increasing loneliness and isolation—all of which we’re grappling with today—show up in Ishiguro’s world. It’s like he hit a fast-forward button, mirroring back to us how things might play out if we don’t approach these technologies with caution and foresight.

The wealthy genetically edit or “lift” their children to set them up for success, while the poor have to make do with the regular old brains and bodies bequeathed them by evolution. Lifted and unlifted kids generally don’t mix, and this is just one of many sinister delineations between a new breed of haves and have-nots.

There’s anger about robots’ steady infiltration into everyday life, and questions about how similar their rights should be to those of humans. “First they take the jobs. Then they take the seats at the theater?” one woman fumes.

References to “changes” and “substitutions” allude to an economy where automation has eliminated millions of jobs. While “post-employed” people squat in abandoned buildings and fringe communities arm themselves in preparation for conflict, those whose livelihoods haven’t been destroyed can afford to have live-in housekeepers and buy Artificial Friends (or AFs) for their lonely children.

“The old traditional model that we still live with now—where most of us can get some kind of paid work in exchange for our services or the goods we make—has broken down,” Ishiguro said in a podcast discussion of the novel. “We’re not talking just about the difference between rich and poor getting bigger. We’re talking about a gap appearing between people who participate in society in an obvious way and people who do not.”

He has a point; as much as techno-optimists claim that the economic changes brought by automation and AI will give us all more free time, let us work less, and devote time to our passion projects, how would that actually play out? What would millions of “post-employed” people receiving basic income actually do with their time and energy?

In the novel, we don’t get much of a glimpse of this side of the equation, but we do see how the wealthy live. After a long wait, just as the store manager seems ready to give up on selling her, Klara is chosen by a 14-year-old girl named Josie, the daughter of a woman who wears “high-rank clothes” and lives in a large, sunny home outside the city. Cheerful and kind, Josie suffers from an unspecified illness that periodically flares up and leaves her confined to her bed for days at a time.

Her life seems somewhat bleak, the need for an AF clear. In this future world, the children of the wealthy no longer go to school together, instead studying alone at home on their digital devices. “Interaction meetings” are set up for them to learn to socialize, their parents carefully eavesdropping from the next room and trying not to intervene when there’s conflict or hurt feelings.

Klara does her best to be a friend, aide, and confidante to Josie while continuing to learn about the world around her and decode the mysteries of human behavior. We surmise that she was programmed with a basic ability to understand emotions, which evolves along with her other types of intelligence. “I believe I have many feelings. The more I observe, the more feelings become available to me,” she explains to one character.

Ishiguro does an excellent job of representing Klara’s mind: a blend of pre-determined programming, observation, and continuous learning. Her narration has qualities both robotic and human; we can tell when something has been programmed in—she “Gives Privacy” to the humans around her when that’s appropriate, for example—and when she’s figured something out for herself.

But the author maintains some mystery around Klara’s inner emotional life. “Does she actually understand human emotions, or is she just observing human emotions and simulating them within herself?” he said. “I suppose the question comes back to, what are our emotions as human beings? What do they amount to?”

Klara is particularly attuned to human loneliness, since she essentially was made to help prevent it. It is, in her view, people’s biggest fear, and something they’ll go to great lengths to avoid, yet can never fully escape. “Perhaps all humans are lonely,” she says.

Warding off loneliness through technology isn’t a futuristic idea, it’s something we’ve been doing for a long time, with the technologies at hand growing more and more sophisticated. Products like AFs already exist. There’s XiaoIce, a chatbot that uses “sentiment analysis” to keep its 660 million users engaged, and Azuma Hikari, a character-based AI designed to “bring comfort” to users whose lives lack emotional connection with other humans.

The mere existence of these tools would be sinister if it wasn’t for their widespread adoption; when millions of people use AIs to fill a void in their lives, it raises deeper questions about our ability to connect with each other and whether technology is building it up or tearing it down.

This isn’t the only big question the novel tackles. An overarching theme is one we’ve been increasingly contemplating as computers start to acquire more complex capabilities, like the beginnings of creativity or emotional awareness: What is it that truly makes us human?

“Do you believe in the human heart?” one character asks. “I don’t mean simply the organ, obviously. I’m speaking in the poetic sense. The human heart. Do you think there is such a thing? Something that makes each of us special and individual?”

The alternative, at least in the story, is that people don’t have a unique essence, but rather we’re all a blend of traits and personalities that can be reduced to strings of code. Our understanding of the brain is still elementary, but at some level, doesn’t all human experience boil down to the firing of billions of neurons between our ears? Will we one day—in a future beyond that painted by Ishiguro, but certainly foreshadowed by it—be able to “decode” our humanity to the point that there’s nothing mysterious left about it? “A human heart is bound to be complex,” Klara says. “But it must be limited.”

Whether or not you agree, Klara and the Sun is worth the read. It’s both a marvelous, engaging story about what it means to love and be human, and a prescient warning to approach technological change with caution and nuance. We’re already living in a world where AI keeps us company, influences our behavior, and is wreaking various forms of havoc. Ishiguro’s novel is a snapshot of one of our possible futures, told through the eyes of a robot who keeps you rooting for her to the end.

Image Credit: Marion Wellmann from Pixabay


#437845 Video Friday: Harmonic Bionics ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRA 2020 – May 31-August 31, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.

Designed to protect employees and passengers from both harmful pathogens and cleaning agents, Breezy One can quickly, safely and effectively decontaminate spaces over 100,000 square feet in 1.5 hours with a patented, environmentally safe disinfectant. Breezy One was co-developed with the City of Albuquerque’s Aviation Department, where it autonomously sanitizes the Sunport’s facilities every night in the ongoing fight against COVID-19.

[ Fetch Robotics ]

Harmonic Bionics is redefining upper extremity neurorehabilitation with intelligent robotic technology designed to maximize patient recovery. Harmony SHR, our flagship product, works with a patient’s scapulohumeral rhythm (SHR) to enable natural, comprehensive therapy for both arms. When combined with Harmony’s Weight Support mode, this unique shoulder design may allow for earlier initiation of post-stroke therapy as Harmony can support a partial dislocation or subluxation of the shoulder prior to initiating traditional therapy exercises.

Harmony's Preprogrammed Exercises promote functional treatment through patient-specific movements that can enable an increased number of repetitions per session without placing a larger physical burden on therapists or their resources. As the only rehabilitation exoskeleton with Bilateral Sync Therapy (BST), Harmony enables intent-based therapy by registering healthy arm movements and synchronizing that motion onto the stroke-affected side to help reestablish neural pathways.

[ Harmonic Bionics ]

Thanks Mok!

Some impressive work here from IHMC and IIT getting Atlas to take steps upward in a way that’s much more human-like than robot-like, which ends up reducing maximum torque requirements by 20 percent.

[ Paper ]

GITAI’s G1 is a general-purpose robot dedicated to space. The G1 will enable automation of various tasks inside and outside space stations and for lunar base development.

[ GITAI ]

Malloy Aeronautics, which now makes drones rather than hoverbikes, has been working with the Royal Navy in New Zealand to figure out how to get cargo drones to land on ships.

The challenge was to test autonomous landing of heavy-lift UAVs on a moving ship; however, due to the COVID-19 lockdown, no ship trials were possible. The moving deck was simulated by driving a vehicle and trailer across an airfield while carrying out multiple landings and take-offs. The autonomous system partner was Planck Aerosystems, and autolanding was triggered by a camera on the UAV reading a QR code on the trailer.

[ Malloy Aeronautics ]

Thanks Paul!

Tertill looks to be relentlessly effective.

[ Franklin Robotics ]

A Swedish company, Tiki Safety, has experienced a record number of orders for its protective masks. At ABB, we are grateful for the opportunity to help Tiki Safety speed up their manufacturing process from 6 minutes to 40 seconds.

[ Tiki Safety ]

The Korea Atomic Energy Research Institute is not messing around with ARMstrong, their robot for nuclear and radiation emergency response.

[ KAERI ]

OMOY is a robot that communicates with its users via internal weight shifting.

[ Paper ]

Now this, this is some weird stuff.

[ Segway ]

CaTARo is a Care Training Assistant Robot from the AIS Lab at Ritsumeikan University.

[ AIS Lab ]

Originally launched in 2015 to assist workers in lightweight assembly tasks, ABB’s collaborative YuMi robot has gone on to blaze a trail in a raft of diverse applications and industries, opening new opportunities and helping to fire people’s imaginations about what can be achieved with robotic automation.

[ ABB ]

This music video features COMAN+, from the Humanoids and Human Centered Mechatronics Lab at IIT, doing what you’d call dance moves if you dance like I do.

[ Alex Braga ] via [ IIT ]

The NVIDIA Isaac Software Development Kit (SDK) enables accelerated AI robot development workflows. Stacked with new tools and application support, Isaac SDK 2020.1 is an end-to-end solution supporting each step of robot fleet deployment, from design collaboration and training to the ongoing maintenance of AI applications.

[ NVIDIA ]

Robot Spy Komodo Dragon and Spy Pig film “a tender moment” between Komodo dragons but will they both survive the encounter?

[ BBC ] via [ Laughing Squid ]

This is part one of a mostly excellent five-part documentary about ROS produced by Red Hat. I say mostly only because they put ME in it for some reason, but fortunately, they talked with many of the core team that developed ROS back at Willow Garage back in the day, and it’s definitely worth watching.

[ Red Hat Open Source Stories ]

It’s been a while, but here’s an update on SRI’s Abacus Drive, from Alexander Kernbaum.

[ SRI ]

This Robots For Infectious Diseases interview features IEEE Fellow Antonio Bicchi, professor of robotics at the University of Pisa, talking about how Italy has been using technology to help manage COVID-19.

[ R4ID ]

Two more interviews this week of celebrity roboticists from MassRobotics: Helen Greiner and Marc Raibert. I’d introduce them, but you know who they are already!

[ MassRobotics ]


#437805 Video Friday: Quadruped Robot HyQ ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Four-legged HyQ balancing on two legs. Nice results from the team at IIT’s Dynamic Legged Systems Lab. And we can’t wait to see the “ninja walk,” currently shown in simulation, implemented with the real robot!

The development of balance controllers for legged robots with point feet remains a challenge when they have to traverse extremely constrained environments. We present a balance controller that has the potential to achieve line walking for quadruped robots. Our initial experiments show the 90-kg robot HyQ balancing on two feet and recovering from external pushes, as well as some changes in posture achieved without losing balance.

[ IIT ]

Thanks Victor!

Ava Robotics’ telepresence robot has been beheaded by MIT, and it now sports a coronavirus-destroying UV array.

UV-C light has proven to be effective at killing viruses and bacteria on surfaces and aerosols, but it’s unsafe for humans to be exposed. Fortunately, Ava’s telepresence robot doesn’t require any human supervision. Instead of the telepresence top, the team subbed in a UV-C array for disinfecting surfaces. Specifically, the array uses short-wavelength ultraviolet light to kill microorganisms and disrupt their DNA in a process called ultraviolet germicidal irradiation. The complete robot system is capable of mapping the space — in this case, GBFB’s warehouse — and navigating between waypoints and other specified areas. In testing the system, the team used a UV-C dosimeter, which confirmed that the robot was delivering the expected dosage of UV-C light predicted by the model.

[ MIT ]

While it’s hard enough to get quadrupedal robots to walk in complex environments, this work from the Robotic Systems Lab at ETH Zurich shows some impressive whole body planning that allows ANYmal to squeeze its body through small or weirdly shaped spaces.

[ RSL ]

Engineering researchers at North Carolina State University and Temple University have developed soft robots inspired by jellyfish that can outswim their real-life counterparts. More practically, the new jellyfish-bots highlight a technique that uses pre-stressed polymers to make soft robots more powerful.

The researchers also used the technique to make a fast-moving robot that resembles a larval insect curling its body, then jumping forward as it quickly releases its stored energy. Lastly, the researchers created a three-pronged gripping robot – with a twist. Most grippers hang open when “relaxed,” and require energy to hold on to their cargo as it is lifted and moved from point A to point B. But this claw’s default position is clenched shut. Energy is required to open the grippers, but once they’re in position, the grippers return to their “resting” mode – holding their cargo tight.

[ NC State ]

As control skills increase, we are more and more impressed by what a Cassie bipedal robot can do. Those who have been following our channel know that we always show the limitations of our work. So while there is still much to do, you gotta like the direction things are going. Later this year, you will see this controller integrated with our real-time planner and perception system. Autonomy with agility! Watch out for us!

[ University of Michigan ]

GITAI’s S1 arm is a little less exciting than their humanoid torso, but it looks like this one might actually be going to the ISS next year.

Here’s how the humanoid would handle a similar task:

[ GITAI ]

Thanks Fan!

If you need a robot that can lift 250 kg at 10 m/s across a workspace of a thousand cubic meters, here’s your answer.

[ Fraunhofer ]

Penn engineers, with funding from the National Science Foundation, have nanocardboard plates able to levitate when bright light is shone on them. This fleet of tiny aircraft could someday explore the skies of other worlds, including Mars. The thinner atmosphere there would give the flyers a boost, enabling them to carry payloads ten times as massive as they are, making them an efficient, light-weight alternative to the Mars helicopter.

[ UPenn ]

Erin Sparks, assistant professor in Plant and Soil Sciences, dreamed of a robot she could use in her research. A perfect partnership was formed when Adam Stager, then a mechanical engineering Ph.D. student, reached out about a robot he had a gut feeling might be useful in agriculture. The pair moved forward with their research with corn at the UD Farm, using the robot to capture dynamic phenotyping information of brace roots over time.

[ Sparks Lab ]

This is a video about robot spy turtles but OMG that bird drone landing gear.

[ PBS ]

If you have a DJI Mavic, you now have something new to worry about.

[ DroGone ]

I was able to spot just one single person in the warehouse footage in this video.

[ Berkshire Grey ]

Flyability has partnered with the ROBINS Project to help fill gaps in the technology used in ship inspections. Watch this video to learn more about the ROBINS project and how Flyability’s drones for confined spaces are helping make inspections on ships safer, cheaper, and more efficient.

[ Flyability ]

In this video, a mission of the Alpha Aerial Scout of Team CERBERUS during the DARPA Subterranean Challenge Urban Circuit event is presented. The Alpha Robot operates inside the Satsop Abandoned Power Plant and performs autonomous exploration. This deployment took place during the 3rd field trial of team CERBERUS during the Urban Circuit event of the DARPA Subterranean Challenge.

[ ARL ]

More excellent talks from the remote Legged Robots ICRA workshop: we’ve posted three here, but there are several other good talks this week as well.

[ ICRA 2020 Legged Robots Workshop ]
