Q&A: Ghost Robotics CEO on Armed ...
Last week, the Association of the United States Army (AUSA) conference took place in Washington, D.C. One of the exhibitors was Ghost Robotics—we've previously covered their nimble and dynamic quadrupedal robots, which originated at the University of Pennsylvania with Minitaur in 2016. Since then, Ghost has developed larger, ruggedized “quadrupedal unmanned ground vehicles” (Q-UGVs) suitable for a variety of applications, one of which is military.
At AUSA, Ghost had a variety of its Vision 60 robots on display with a selection of defense-oriented payloads, including the system above, which is a remotely controlled rifle customized for the robot by a company called SWORD International.
The image of a futuristic-looking, potentially lethal weapon on a quadrupedal robot has generated some very strong reactions (the majority of them negative) in the media as well as on social media over the past few days. We recently spoke with Ghost Robotics' CEO Jiren Parikh to understand exactly what was being shown at AUSA, and to get his perspective on providing the military with armed autonomous robots.
IEEE Spectrum: Can you describe the level of autonomy that your robot has, as well as the level of autonomy that the payload has?
Jiren Parikh: It's critical to separate the two. The SPUR, or Special Purpose Unmanned Rifle from SWORD Defense, has no autonomy and no AI. It's triggered from a distance, and that has to be done by a human. There is always an operator in the loop. SWORD's customers include special operations teams worldwide, and when SWORD contacted us through a former special ops team member, the idea was to create a walking tripod proof of concept. They wanted a way of keeping the human who would otherwise have to pull the trigger at a distance from the weapon, to minimize the danger that they'd be in. We thought it was a great idea.
Our robot is also not autonomous. It's remotely operated with an operator in the loop. It does have perception for object avoidance for the environment because we need it to be able to walk around things and remain stable on unstructured terrain, and the operator has the ability to set GPS waypoints so it travels to a specific location. There's no targeting or weapons-related AI, and we have no intention of doing that. We support SWORD Defense like we do any other military, public safety or enterprise payload partner, and don't have any intention of selling weapons payloads.
Who is currently using your robots?
We have more than 20 worldwide government customers from various agencies, US and allied, who abide by very strict rules. You can see it and feel it when you talk to any of these agencies; they are not pro-autonomous weapons. I think they also recognize that they have to be careful about what they introduce. The vast majority of our customers are using them or developing applications for CBRNE [Chemical, Biological, Radiological, Nuclear, and Explosives detection], reconnaissance, target acquisition, confined space and subterranean inspection, mapping, EOD safety, wireless mesh networks, perimeter security and other applications where they want a better option than tracked and wheeled robots that are less agile and capable.
We also have agencies that do work where we are not privy to details. We sell them our robot and they can use it with any software, any radio, and any payload, and the folks that are using these systems, they're probably special teams, WMD and CBRN units and other special units doing confidential or classified operations in remote locations. We can only assume that a lot of our customers are doing really difficult, dangerous work. And remember that these are men and women who can't talk about what they do, with families who are under constant stress. So all we're trying to do is allow them to use our robot in military and other government agency applications to keep our people from getting hurt. That's what we promote. And if it's a weapon that they need to put on our robot to do their job, we're happy for them to do that. No different than any other dual use technology company that sells to defense or other government agencies.
How is what Ghost Robotics had on display at AUSA functionally different from other armed robotic platforms that have been around for well over a decade?
Decades ago, we had guided missiles, which are basically robots with weapons on them. People don't consider them robots, but that's what they are. More recently, there have been drones and ground robots with weapons on them. But they didn't have legs, and they're not invoking this evolutionary memory of predators. And now add science fiction movies and social media to that, which we have no control over—the challenge for us is that legged robots are fascinating, and science fiction has made them scary. So I think we're going to have to socialize these kinds of legged systems over the next five to ten years in small steps, and hopefully people get used to them and understand the benefits for our soldiers. But we know it can be frightening. We also have families, and we think about these things as well.
“If our robot had tracks on it instead of legs, nobody would be paying attention.”
—Jiren Parikh
Are you concerned that showing legged robots with weapons will further amplify this perception problem, and make people less likely to accept them?
In the short term, weeks or months, yes. I think if you're talking about a year or two, no. We will get used to these robots just like armed drones; they just have to be socialized. If our robot had tracks on it instead of legs, nobody would be paying attention. We just have to get used to robots with legs.
More broadly, how does Ghost Robotics think armed robots should or should not be used?
I think there is a critical place for these robots in the military. Our military is here to protect us, and there are servicemen and women who are putting their lives on the line every day to protect the United States and allies. I do not want them to go without our robot, with whatever payload, including weapons systems, if they need it to do their job and keep us safe. And if we've saved one life because these people had our robot when they needed it, I think that's something to be proud of.
I'll tell you personally: until I joined Ghost Robotics, I was oblivious to the amount of stress and turmoil and pain our servicemen and women go through to protect us. Some of the special operations folks that we talk to, they can't disclose what they do, but you can feel it when they talk about their colleagues and comrades that they've lost. The amount of energy that's put into protecting us by these people that we don't even know is really amazing, and we take it for granted.
What about in the context of police rather than the military?
I don't see that happening. We've just started talking with law enforcement, but we haven't had any inquiries on weapons. It's been hazmat, CBRNE, recon of confined spaces and crime scenes or sending robots in to talk with people that are barricaded or involved in a hostage situation. I don't think you're going to see the police using weaponized robots. In other countries, it's certainly possible, but I believe that it won't happen here. We live in a country where our military is run by a very strict set of rules, and we have this political and civilian backstop on how engagements should be conducted with new technologies.
How do you feel about the push for regulation of lethal autonomous weapons?
We're all for regulation. We're all for it. This is something everybody should be for right now. What those regulations are, what you can or can't do and how AI is deployed, I think that's for politicians and the armed services to decide. The question is whether the rest of the world will abide by it, and so we have to be realistic and we have to be ready to support defending ourselves against rogue nations or terrorist organizations that feel differently. Sticking your head in the sand is not the solution.
Based on the response that you've experienced over the past several days, will you be doing anything differently going forward?
We're very committed to what we're doing, and our team here understands our mission. We're not going to be reactive. And we're going to stick by our commitment to our US and allied government customers. We're going to help them do whatever they need to do, with whatever payload they need, to do their job, and do it safely. We are very fortunate to live in a country where the use of military force is a last resort, and the use of new technologies and weapons takes years and involves considerable deliberation from the armed services with civilian oversight.
When Robots Enter the World, Who Is ...
Over the last half decade or so, the commercialization of autonomous robots that can operate outside of structured environments has dramatically increased. But this relatively new transition of robotic technologies from research projects to commercial products comes with its share of challenges, many of which relate to the rapidly increasing visibility that these robots have in society.
Whether it's because of their appearance of agency, or because of their history in popular culture, robots frequently inspire people’s imagination. Sometimes this is a good thing, like when it leads to innovative new use cases. And sometimes this is a bad thing, like when it leads to use cases that could be classified as irresponsible or unethical. Can the people selling robots do anything about the latter? And even if they can, should they?
Roboticists understand that robots, fundamentally, are tools. We build them, we program them, and even the autonomous ones are just following the instructions that we’ve coded into them. However, that same appearance of agency that makes robots so compelling means that it may not be clear to people without much experience with or exposure to real robots that a robot itself isn’t inherently good or bad—rather, as a tool, a robot is a reflection of its designers and users.
This can put robotics companies into a difficult position. When they sell a robot to someone, that person can, hypothetically, use the robot in any way they want. Of course, this is the case with every tool, but it’s the autonomous aspect that makes robots unique. I would argue that autonomy brings with it an implied association between a robot and its maker, or in this case, the company that develops and sells it. I’m not saying that this association is necessarily a reasonable one, but I think that it exists, even if that robot has been sold to someone else who has assumed full control over everything it does.
“All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon”
—Robert Playter, Boston Dynamics
Robotics companies are certainly aware of this, because many of them are very careful about who they sell their robots to, and very explicit about what they want their robots to be doing. But once a robot is out in the wild, as it were, how far should that responsibility extend? And realistically, how far can it extend? Should robotics companies be held accountable for what their robots do in the world, or should we accept that once a robot is sold to someone else, responsibility is transferred as well? And what can be done if a robot is being used in an irresponsible or unethical way that could have a negative impact on the robotics community?
For perspective on this, we contacted folks from three different robotics companies, each of which has experience selling distinctive mobile robots to commercial end users. We asked them the same five questions about the responsibility that robotics companies have regarding the robots that they sell, and here’s what they had to say:
Do you have any restrictions on what people can do with your robots? If so, what are they, and if not, why not?
Péter Fankhauser, CEO, ANYbotics:
We work closely with our customers to make sure that our solution provides the right approach for their problem. This way, the target use case is clear from the beginning, and we do not work with customers interested in using our robot ANYmal outside the intended target applications. Specifically, we strictly exclude any military or weaponized uses; since the founding of ANYbotics, it has been close to our heart to make human work easier, safer, and more enjoyable.
Robert Playter, CEO, Boston Dynamics:
Yes, we have restrictions on what people can do with our robots, which are outlined in our Terms and Conditions of Sale. All of our buyers, without exception, must agree that Spot will not be used to harm or intimidate people or animals, as a weapon or configured to hold a weapon. Spot, just like any product, must be used in compliance with the law.
Ryan Gariepy, CTO, Clearpath Robotics:
We do have strict restrictions and KYC processes which are based primarily on Canadian export control regulations. They depend on the type of equipment sold as well as where it is going. More generally, we also will not sell or support a robot if we know that it will create an uncontrolled safety hazard or if we have reason to believe that the buyer is unqualified to use the product. And, as always, we do not support using our products for the development of fully autonomous weapons systems.
More broadly, if you sell someone a robot, why should they be restricted in what they can do with it?
Péter Fankhauser, ANYbotics: We see the robot less as a simple object and more as an artificial workforce. To us, this implies that the usage is closely coupled with the transfer of the robot, and that the customer and the provider agree on what the robot is expected to do. This approach is supported by what we hear from our customers, who show an increasing interest in paying for the robots as a service or per use.
Robert Playter, Boston Dynamics: We’re offering a product for sale. We’re going to do the best we can to stop bad actors from using our technology for harm, but we don’t have the control to regulate every use. That said, we believe that our business will be best served if our technology is used for peaceful purposes—to work alongside people as trusted assistants and remove them from harm’s way. We do not want to see our technology used to cause harm or promote violence. Our restrictions are similar to those of other manufacturers or technology companies that take steps to reduce or eliminate the violent or unlawful use of their products.
Ryan Gariepy, Clearpath Robotics: Assuming the organization doing the restricting is a private organization and the robot and its software are sold rather than leased or “managed,” there aren't strong legal reasons to restrict use. That being said, the manufacturer likewise has no obligation to continue supporting that specific robot or customer going forward. However, given that we are only at the very edge of how robots will reshape a great deal of society, it is in the best interest of the manufacturer and user to be honest with each other about their respective goals. Right now, you're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.
“If a robot is being used in a way that is irresponsible due to safety: intervene! If it’s unethical: speak up!”
—Péter Fankhauser, ANYbotics
What can you realistically do to make sure that people who buy your robots use them in the ways that you intend?
Péter Fankhauser, ANYbotics: We maintain a close collaboration with our customers to ensure their success with our solution. For that reason, we have refrained from technical measures to block unintended use.
Robert Playter, Boston Dynamics: We vet our customers to make sure that their desired applications are things that Spot can support, and are in alignment with our Terms and Conditions of Sale. We’ve turned away customers whose applications aren’t a good match with our technology. If customers misuse our technology, we’re clear in our Terms of Sale that their violations may void our warranty and prevent their robots from being updated, serviced, repaired, or replaced. We may also repossess robots that are not purchased, but leased. Finally, we will refuse future sales to customers that violate our Terms of Sale.
Ryan Gariepy, Clearpath Robotics: We typically work with our clients ahead of the purchase to make sure their expectations match reality, in particular on aspects like safety, supervisory requirements, and usability. It's far worse to sell a robot that'll sit on a shelf or, worse, cause harm, than to not sell a robot at all, so we prefer to reduce the risk of this situation in advance of receiving an order or shipping a robot.
How do you evaluate the merit of edge cases, for example if someone wants to use your robot in research or art that may push the boundaries of what you personally think is responsible or ethical?
Péter Fankhauser, ANYbotics: It’s about dialog, understanding, and figuring out alternatives that work for all involved parties, and the earlier you can have this dialog, the better.
Robert Playter, Boston Dynamics: There’s a clear line between exploring robots in research and art, and using the robot for violent or illegal purposes.
Ryan Gariepy, Clearpath Robotics: We have sold thousands of robots to hundreds of clients, and I do not recall the last situation that was not covered by a combination of export control and a general evaluation of the client's goals and expectations. I'm sure this will change as robots continue to drop in price and increase in flexibility and usability.
“You're not only investing in the initial purchase and relationship, you're investing in the promise of how you can help each other succeed in the future.”
—Ryan Gariepy, Clearpath Robotics
What should roboticists do if we see a robot being used in a way that we feel is unethical or irresponsible?
Péter Fankhauser, ANYbotics: If it’s irresponsible due to safety: intervene! If it’s unethical: speak up!
Robert Playter, Boston Dynamics: We want robots to be beneficial for humanity, which includes the notion of not causing harm. As an industry, we think robots will achieve long-term commercial viability only if people see robots as helpful, beneficial tools without worrying if they’re going to cause harm.
Ryan Gariepy, Clearpath Robotics: On a one-off basis, they should speak to a combination of the user, the supplier or suppliers, the media, and, if safety is an immediate concern, regulatory or government agencies. If the situation in question risks becoming commonplace and is not being taken seriously, they should speak up more generally in appropriate forums—conferences, industry groups, standards bodies, and the like.
As more and more robots representing different capabilities become commercially available, these issues are likely to come up more frequently. The three companies we talked to certainly don’t represent every viewpoint, and we did reach out to other companies who declined to comment. But I would think (I would hope?) that everyone in the robotics community can agree that robots should be used in a way that makes people’s lives better. What “better” means in the context of art and research and even robots in the military may not always be easy to define, and inevitably there’ll be disagreement as to what is ethical and responsible, and what isn’t.
We’ll keep on talking about it, though, and do our best to help the robotics community to continue growing and evolving in a positive way. Let us know what you think in the comments.