Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#439863 Q&A: Ghost Robotics CEO on Armed ...

Last week, the Association of the United States Army (AUSA) conference took place in Washington, D.C. One of the exhibitors was Ghost Robotics—we've previously covered their nimble and dynamic quadrupedal robots, which originated at the University of Pennsylvania with Minitaur in 2016. Since then, Ghost has developed larger, ruggedized “quadrupedal unmanned ground vehicles” (Q-UGVs) suitable for a variety of applications, one of which is military.

At AUSA, Ghost had a variety of its Vision 60 robots on display with a selection of defense-oriented payloads, including the system above, which is a remotely controlled rifle customized for the robot by a company called SWORD International.

The image of a futuristic-looking, potentially lethal weapon on a quadrupedal robot has generated some very strong reactions (the majority of them negative) in the media as well as on social media over the past few days. We recently spoke with Ghost Robotics' CEO Jiren Parikh to understand exactly what was being shown at AUSA, and to get his perspective on providing the military with armed autonomous robots.
IEEE Spectrum: Can you describe the level of autonomy that your robot has, as well as the level of autonomy that the payload has?

Jiren Parikh: It's critical to separate the two. The SPUR, or Special Purpose Unmanned Rifle from SWORD Defense, has no autonomy and no AI. It's triggered from a distance, and that has to be done by a human. There is always an operator in the loop. SWORD's customers include special operations teams worldwide, and when SWORD contacted us through a former special ops team member, the idea was to create a walking tripod proof of concept. They wanted a way of keeping the human who would otherwise have to pull the trigger at a distance from the weapon, to minimize the danger that they'd be in. We thought it was a great idea.
Our robot is also not autonomous. It's remotely operated with an operator in the loop. It does have perception for object avoidance for the environment because we need it to be able to walk around things and remain stable on unstructured terrain, and the operator has the ability to set GPS waypoints so it travels to a specific location. There's no targeting or weapons-related AI, and we have no intention of doing that. We support SWORD Defense like we do any other military, public safety or enterprise payload partner, and don't have any intention of selling weapons payloads.

Who is currently using your robots?

We have more than 20 worldwide government customers from various agencies, US and allied, who abide by very strict rules. You can see it and feel it when you talk to any of these agencies; they are not pro-autonomous weapons. I think they also recognize that they have to be careful about what they introduce. The vast majority of our customers are using them or developing applications for CBRNE [Chemical, Biological, Radiological, Nuclear, and Explosives detection], reconnaissance, target acquisition, confined space and subterranean inspection, mapping, EOD safety, wireless mesh networks, perimeter security and other applications where they want a better option than tracked and wheeled robots that are less agile and capable.

We also have agencies that do work where we are not privy to details. We sell them our robot and they can use it with any software, any radio, and any payload, and the folks that are using these systems, they're probably special teams, WMD and CBRN units and other special units doing confidential or classified operations in remote locations. We can only assume that a lot of our customers are doing really difficult, dangerous work. And remember that these are men and women who can't talk about what they do, with families who are under constant stress. So all we're trying to do is allow them to use our robot in military and other government agency applications to keep our people from getting hurt. That's what we promote. And if it's a weapon that they need to put on our robot to do their job, we're happy for them to do that. No different than any other dual use technology company that sells to defense or other government agencies.
How is what Ghost Robotics had on display at AUSA functionally different from other armed robotic platforms that have been around for well over a decade?

Decades ago, we had guided missiles, which are basically robots with weapons on them. People don't consider it a robot, but that's what it is. More recently, there have been drones and ground robots with weapons on them. But they didn't have legs, and they're not invoking this evolutionary memory of predators. And now add science fiction movies and social media to that, which we have no control over—the challenge for us is that legged robots are fascinating, and science fiction has made them scary. So I think we're going to have to socialize these kinds of legged systems over the next five to ten years in small steps, and hopefully people get used to them and understand the benefits for our soldiers. But we know it can be frightening. We also have families, and we think about these things as well.

“If our robot had tracks on it instead of legs, nobody would be paying attention.”
—Jiren Parikh

Are you concerned that showing legged robots with weapons will further amplify this perception problem, and make people less likely to accept them?

In the short term, weeks or months, yes. I think if you're talking about a year or two, no. We will get used to these robots just like armed drones; they just have to be socialized. If our robot had tracks on it instead of legs, nobody would be paying attention. We just have to get used to robots with legs.

More broadly, how does Ghost Robotics think armed robots should or should not be used?

I think there is a critical place for these robots in the military. Our military is here to protect us, and there are servicemen and women who are putting their lives on the line everyday to protect the United States and allies. I do not want them to lack for our robot with whatever payload, including weapons systems, if they need it to do their job and keep us safe. And if we've saved one life because these people had our robot when they needed it, I think that's something to be proud of.

I'll tell you personally: until I joined Ghost Robotics, I was oblivious to the amount of stress and turmoil and pain our servicemen and women go through to protect us. Some of the special operations folks that we talk to, they can't disclose what they do, but you can feel it when they talk about their colleagues and comrades that they've lost. The amount of energy that's put into protecting us by these people that we don't even know is really amazing, and we take it for granted.

What about in the context of police rather than the military?

I don't see that happening. We've just started talking with law enforcement, but we haven't had any inquiries on weapons. It's been hazmat, CBRNE, recon of confined spaces and crime scenes or sending robots in to talk with people that are barricaded or involved in a hostage situation. I don't think you're going to see the police using weaponized robots. In other countries, it's certainly possible, but I believe that it won't happen here. We live in a country where our military is run by a very strict set of rules, and we have this political and civilian backstop on how engagements should be conducted with new technologies.

How do you feel about the push for regulation of lethal autonomous weapons?

We're all for regulation. We're all for it. This is something everybody should be for right now. What those regulations are, what you can or can't do and how AI is deployed, I think that's for politicians and the armed services to decide. The question is whether the rest of the world will abide by it, and so we have to be realistic and we have to be ready to support defending ourselves against rogue nations or terrorist organizations that feel differently. Sticking your head in the sand is not the solution.

Based on the response that you've experienced over the past several days, will you be doing anything differently going forward?

We're very committed to what we're doing, and our team here understands our mission. We're not going to be reactive. And we're going to stick by our commitment to our US and allied government customers. We're going to help them do whatever they need to do, with whatever payload they need, to do their job, and do it safely. We are very fortunate to live in a country where the use of military force is a last resort, and the use of new technologies and weapons takes years and involves considerable deliberation from the armed services with civilian oversight. Continue reading

Posted in Human Robots

#439861 Researchers successfully build ...

As a robotics engineer, Yasemin Ozkan-Aydin, assistant professor of electrical engineering at the University of Notre Dame, gets her inspiration from biological systems. The collective behaviors that ants, honeybees, and birds use to solve problems and overcome obstacles are something researchers have sought to replicate in aerial and underwater robotics. Developing small-scale swarm robots with the capability to traverse complex terrain, however, comes with a unique set of challenges. Continue reading

Posted in Human Robots

#439857 Video Friday: ANYmals and Animals

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ROSCon 2021 – October 20-21, 2021 – [Online Event]
Silicon Valley Robot Block Party – October 23, 2021 – Oakland, CA, USA
SSRR 2021 – October 25-27, 2021 – New York, NY, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

This project investigates the interaction between robots and animals, in particular the quadruped ANYmal and wild vervet monkeys. We will test whether robots can be not only tolerated but also socially accepted in a group of vervets. We will evaluate whether social bonds are created between them and whether vervets trust knowledge from robots.

[ RSL ]

At this year's ACM Symposium on User Interface Software and Technology (UIST), the Student Innovation Contest was based around Sony Toio robots. Here are some of the things that teams came up with:

[ UIST ]

Collecting samples from Mars and bringing them back to Earth will be a historic undertaking that started with the launch of NASA's Perseverance rover on July 30, 2020. Perseverance collected its first rock core samples in September 2021. The rover will leave them on Mars for a future mission to retrieve and return to Earth. NASA and the European Space Agency (ESA) are solidifying concepts for this proposed Mars Sample Return campaign. The current concept includes a lander, a fetch rover, an ascent vehicle to launch the sample container to Martian orbit, and a retrieval spacecraft with a payload for capturing and containing the samples and then sending them back to Earth to land in an unpopulated area.

[ JPL ]

FCSTAR is a minimally actuated flying climbing robot capable of crawling vertically. It is the latest in the family of STAR robots, designed and built at the Bio-Inspired and Medical Robotics Lab at Ben Gurion University of the Negev by Nitzan Ben David and David Zarrouk.

[ BGU ]

Evidently the novelty of Spot has not quite worn off yet.

[ IRL ]

As much as I like Covariant, it seems weird to call a robot like this “Waldo” when the word waldo already has a specific meaning in robotics, thanks to the short story by Robert A. Heinlein.

Also, kinda looks like it failed that very first pick in the video…?

[ Covariant ]

Thanks, Alice!

Here is how I will be assembling the Digit that I'm sure Agility Robotics will be sending me any day now.

[ Agility Robotics ]

Robotis would like to remind you that ROS World is next week, and also that they make a lot of ROS-friendly robots!

[ ROS World ] via [ Robotis ]

Researchers at the School of Architecture at the University of Technology Sydney (UTS) in Australia have partnered with construction design firm BVN Architecture to develop a unique 3D-printed air-diffusion system.

[ UTS ]

Team MARBLE, who took third at the DARPA SubT Challenge, has put together this video, which combines DARPA's videos with footage taken by the team to tell the whole story, with some behind-the-scenes stuff thrown in.

[ MARBLE ]

You probably don't need to watch all 10 minutes of the first public flight of Volocopter's cargo drone, but it's fun to see the propellers spin up for the takeoff.

[ Volocopter ]

Nothing new in this video about Boston Dynamics from CNBC, but it's always cool to get a little wander around their headquarters.

[ CNBC ]

Computing power doubles roughly every two years, an observation popularly associated with Moore's Law (which, strictly speaking, describes transistor counts). Prof Maarten Steinbuch, a high-tech systems scientist, entrepreneur and communicator from Eindhoven University of Technology, discussed how this exponential rate of change enables accelerating developments in sensor technology, AI computing and automotive machines, to make products in modern factories that will soon be smart and self-learning.

[ ESA ]

On episode three of The Robot Brains Podcast, we have deep learning pioneer Yann LeCun. Yann is a winner of the Turing Award (often called the Nobel Prize of Computer Science) who in 2013 was handpicked by Mark Zuckerberg to bring AI to Facebook. Yann also offers his predictions for the future of artificial general intelligence, talks about his life straddling the worlds of academia and business, and explains why he likes to picture AI as a chocolate layer cake with a cherry on top.

[ Robot Brains ]

This week's CMU RI seminar is from Tom Howard at the University of Rochester, on “Enabling Grounded Language Communication for Human-Robot Teaming.”

[ CMU RI ]

A pair of talks from the Maryland Robotics Center, including Maggie Wigness from ARL and Dieter Fox from UW and NVIDIA.

[ Maryland Robotics ] Continue reading

Posted in Human Robots

#439853 This Week’s Awesome Tech Stories From ...

ARTIFICIAL INTELLIGENCE
Facebook Is Researching AI Systems That See, Hear, and Remember Everything You Do
James Vincent | The Verge
“[Facebook’s AI team] imagines AI systems that are constantly analyzing peoples’ lives using first-person video; recording what they see, do, and hear in order to help them with everyday tasks. Facebook’s researchers have outlined a series of skills it wants these systems to develop, including ‘episodic memory’ (answering questions like ‘where did I leave my keys?’) and ‘audio-visual diarization’ (remembering who said what when).”

ROBOTICS
Drone Delivers Lungs to Transplant Recipient, a Medical First
George Dvorsky | Gizmodo
“As the Canadian Press reports, some 80% of donated lungs cannot be used owing to problems having to do with insufficient oxygenation or a failure to meet minimal functional standards. And like any transplanted organ, time is of the essence; the quicker an organ can be brought to the patient, the better. Hence the desire to ship organs through the air, rather than through congested city traffic.”

SPACE
At 90, William Shatner Becomes the Oldest Person to Reach ‘the Final Frontier’
Daniel E. Slotnick | The New York Times
“The actor spoke of how the experience of seeing the blue earth from space and the immense blackness of outer space had profoundly moved him, demonstrating what he called the ‘vulnerability of everything.’ The atmosphere keeping humanity alive is ‘thinner than your skin,’ he said.”

SECURITY
Fraudsters Cloned Company Director’s Voice in $35 Million Bank Heist, Police Find
Thomas Brewster | Forbes
“What [the bank manager] didn’t know was that he’d been duped as part of an elaborate swindle, one in which fraudsters had used ‘deep voice’ technology to clone the director’s speech, according to a court document unearthed by Forbes in which the U.A.E. has sought American investigators’ help in tracing $400,000 of stolen funds that went into US-based accounts held by Centennial Bank.”

CRYPTOCURRENCY
This Is the True Scale of China’s Bitcoin Exodus
Gian M. Volpicelli | Wired UK
“The figures, gathered by the Cambridge Centre for Alternative Finance (CCAF) found that by the end of August 2021, the percentage of bitcoin mining taking place in China had ‘effectively dropped to zero.’ That is a staggering reversal for a country that, as late as September 2019, was believed to be home to 75.53 percent of global bitcoin mining operations.”

TRANSPORTATION
90% of New Cars Sold in Norway Are Now Electric or Plug-in Hybrids
Adele Peters | Fast Company
“In 2012, electric and plug-in hybrid cars made up just 3% of new car sales in Norway. By 2019, that had jumped to 56%. Now, the country wants to get to 100% EV sales by 2025—and it might actually succeed. The Norwegian Automobile Federation recently reported that if past trends continue, it’s possible that the last fossil fuel-powered vehicle in Norway might be sold as soon as next year.”

FUTURE
Pentagon Wants AI to Predict Events Before They Occur
Natasha Bajema | IEEE Spectrum
“What if by leveraging today’s artificial intelligence to predict events several days in advance, countries like the United States could simply avoid warfare in the first place? It sounds like the ultimate form of deterrence, a strategy that would save everyone all sorts of trouble and it’s the type of visionary thinking that is driving U.S. military commanders and senior defense policymakers toward the rapid adoption of artificial intelligence (AI)-enabled situational awareness platforms.”

DIGITAL MEDIA
AI Fake-Face Generators Can Be Rewound to Reveal the Real Faces They Trained On
Will Douglas Heaven | MIT Technology Review
“In a paper titled This Person (Probably) Exists, researchers show that many faces produced by GANs bear a striking resemblance to actual people who appear in the training data. The fake faces can effectively unmask the real faces the GAN was trained on, making it possible to expose the identity of those individuals.”

Image Credit: Lance Anderson / Unsplash Continue reading

Posted in Human Robots

#439849 Boots Full of Nickels Help Mini Cheetah ...

As quadrupedal robots learn to do more and more dynamic tasks, they're likely to spend more and more time not on their feet. Not falling over, necessarily (although that's inevitable of course, because they're legged robots after all)—but just being in flight in one way or another. The most risky of flight phases would be a fall from a substantial height, because it's almost certain to break your very expensive robot and any payload it might have.
Falls being bad is not a problem unique to robots, and it's not surprising that quadrupeds in nature have already solved it. Or at least, it's already been solved by cats, which are able to reliably land on their feet to mitigate fall damage. To teach quadrupedal robots this trick, roboticists from the University of Notre Dame have been teaching a Mini Cheetah quadruped some mid-air self-righting skills, with the aid of boots full of nickels.

If this research looks a little bit familiar, it's because we recently covered some work from ETH Zurich that looked at using legs to reorient their SpaceBok quadruped in microgravity. This work with Mini Cheetah has to contend with Earth gravity, however, which puts some fairly severe time constraints on the whole reorientation thing, with the penalty for failure being a smashed-up robot rather than just a weird bounce. When we asked the ETH Zurich researchers what might improve the performance of SpaceBok, they told us that “heavy shoes would definitely help,” and it looks like the folks from Notre Dame had the same idea, which they were able to implement on Mini Cheetah.

Mini Cheetah's legs (like the legs of many robots) were specifically designed to be lightweight, because they have to move quickly and you want to minimize the mass that moves back and forth with every step to make the robot as efficient as possible. But for a robot to reorient itself in mid-air, it's got to start swinging as much mass around as it can. Each of Mini Cheetah's legs has therefore been fitted with a 3D-printed boot packed with two rolls of American nickels, adding about 500 g to each foot—enough to move the robot around the way it needs to. The nickel boots matter because the only way Mini Cheetah has of changing its orientation while falling is by flailing its legs around: when its legs move one way, its body moves the other way, and the heavier the legs are, the more torque they can exert on the body.
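Roughly speaking, this is conservation of angular momentum at work: in mid-air there is nothing to push against, so any spin the legs pick up has to be cancelled by the body rotating the other way. As a back-of-the-envelope illustration only (a planar two-body model with assumed numbers, not parameters from the paper), heavier feet give the legs a larger share of the total inertia, so the same leg sweep buys more body rotation:

# A rough, planar two-body sketch of why heavier feet help. All numbers are
# assumed for illustration and are not taken from the Notre Dame paper.
# With zero total angular momentum in mid-air, a commanded leg sweep theta_rel
# rotates the body the other way by roughly
#   theta_body = -I_legs / (I_legs + I_body) * theta_rel

def body_rotation_deg(theta_rel_deg, foot_mass_kg, leg_radius_m, body_inertia, n_legs=4):
    """Body counter-rotation (degrees) from sweeping all legs together."""
    # Treat each foot as a point mass at distance leg_radius_m from the body's rotation axis.
    legs_inertia = n_legs * foot_mass_kg * leg_radius_m ** 2
    return -legs_inertia / (legs_inertia + body_inertia) * theta_rel_deg

# Assumed stock foot (~0.2 kg) vs. nickel-booted foot (~0.7 kg), swept through
# 120 degrees, with an assumed body inertia of 0.1 kg*m^2 about the same axis:
for foot_mass_kg in (0.2, 0.7):
    print(foot_mass_kg, round(body_rotation_deg(120, foot_mass_kg, 0.2, 0.1), 1))

With these made-up numbers the nickel boots roughly double the body rotation available from a single leg sweep, which is exactly the effect the researchers are exploiting.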
As with everything in robotics, getting the hardware to do what you want it to do is only half the battle. Or sometimes much, much less than half the battle. The challenge with Mini Cheetah flipping itself over is that it has a very, very small amount of time to figure out how to do it properly. It has to detect that it's falling, figure out what orientation it's in, make a plan for getting itself feet-down, and then execute that plan successfully. The robot doesn't have enough time to put a whole heck of a lot of thought into things as it starts to plummet, so the researchers came up with what they call a “reflex” approach. Vince Kurtz, first author on the paper describing this technique, explains how it works:
While trajectory optimization algorithms keep getting better and better, they still aren't quite fast enough to find a solution from scratch in the fraction of a second between when the robot detects a fall and when it needs to start a recovery motion. We got around this by dropping the robot a bunch of times in simulation, where we can take as much time as we need to find a solution, and training a neural network to imitate the trajectory optimizer. The trained neural network maps initial orientations to trajectories that land the robot on its feet. We call this the “reflex” approach, since the neural network has basically learned an automatic response that can be executed when the robot detects that it's falling.

This technique works quite well (a minimal code sketch of the idea appears at the end of this article), but there are a few constraints, most of which wouldn't seem so bad if we weren't comparing quadrupedal robots to quadrupedal animals. Cats are just, like, super competent at what they do, says Kurtz, and being able to mimic their ability to rapidly twist themselves into a favorable landing configuration from any starting orientation is just going to be really hard for a robot to pull off:
The more I do robotics research the more I appreciate how amazing nature is, and this project is a great example of that. Cats can do a full 180° rotation when dropped from about shoulder height. Our robot ran up against torque limits when rotating 90° from about 10ft off the ground. Using the full 3D motion would be a big improvement (rotating sideways should be easier because the robot's moment of inertia is smaller in that direction), though I'd be surprised if that alone got us to cat-level performance.
The biggest challenge that I see in going from 2D to 3D is self-collisions. Keeping the robot from hitting itself seems like it should be simple, but self-collisions turn out to impose rather nasty non-convex constraints that make it numerically difficult (though not impossible) for trajectory optimization algorithms to find high-quality solutions.

Lastly, we asked Kurtz to talk a bit about whether it's worth exploring flexible actuated spines for quadrupedal robots. We know that such spines offer many advantages (a distant relative of Mini Cheetah had one, for example), but that they're also quite complex. So is it worth it?
This is an interesting question. Certainly in the case of the falling cat problem a flexible spine would help, both in terms of having a naturally flexible mass distribution and in terms of controller design, since we might be able to directly imitate the “bend-and-twist” motion of cats. Similarly, a flexible spine might help for tasks with large flight phases, like the jumping in space problems discussed in the ETH paper.
With that being said, mid-air reorientation is not the primary task of most quadruped robots, and it's not obvious to me that a flexible spine would help much for walking, running, or scrambling over uneven terrain. Also, existing hardware platforms with rigid backs like the Mini Cheetah are quite capable and I think we still haven't unlocked the full potential of these robots. Control algorithms are still the primary limiting factor for today's legged robots, and adding a flexible spine would probably make for even more difficult control problems.

Mini Cheetah, the Falling Cat: A Case Study in Machine Learning and Trajectory Optimization for Robot Acrobatics, by Vince Kurtz, He Li, Patrick M. Wensing, and Hai Lin from University of Notre Dame, is available on arXiv. Continue reading
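For readers curious what the “reflex” approach Kurtz describes might look like in code, here is a minimal, hypothetical sketch of the general recipe: run a slow trajectory optimizer offline over many simulated drops, then train a small network to imitate it so the robot can look up a recovery motion instantly. None of the function names, network sizes, or data formats below come from the Notre Dame implementation, and the optimizer here is a stand-in placeholder.

import numpy as np
import torch
import torch.nn as nn

HORIZON, N_JOINTS = 50, 8  # assumed trajectory length and joint count

def optimize_recovery(initial_orientation):
    """Stand-in for the offline trajectory optimizer (slow, run only in simulation).
    Returns a (HORIZON, N_JOINTS) array of joint targets that lands the robot feet-down."""
    rng = np.random.default_rng(int(abs(initial_orientation[0]) * 1000))
    return rng.standard_normal((HORIZON, N_JOINTS)) * 0.1  # placeholder output

# 1) Offline dataset: sample initial orientations (here just a roll angle) and
#    record the optimizer's solution for each one.
orientations = np.random.uniform(-np.pi, np.pi, size=(2000, 1)).astype(np.float32)
trajectories = np.stack([optimize_recovery(o) for o in orientations]).astype(np.float32)

# 2) Train a small MLP to map initial orientation -> flattened joint trajectory.
net = nn.Sequential(
    nn.Linear(1, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, HORIZON * N_JOINTS),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.from_numpy(orientations)
y = torch.from_numpy(trajectories.reshape(len(trajectories), -1))
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

# 3) Online "reflex": when a fall is detected, a single forward pass produces the
#    recovery trajectory to track, in microseconds rather than seconds.
with torch.no_grad():
    reflex_trajectory = net(torch.tensor([[0.8]])).reshape(HORIZON, N_JOINTS)

The imitation step buys nothing but speed: all of the expensive optimization happens ahead of time in simulation, and at drop time the controller only needs one cheap network evaluation.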

Posted in Human Robots