#437707 Video Friday: This Robot Will Restock ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Tokyo startup Telexistence has recently unveiled a new robot called the Model-T, an advanced teleoperated humanoid that can use tools and grasp a wide range of objects. Japanese convenience store chain FamilyMart plans to test the Model-T to restock shelves in up to 20 stores by 2022. In the trial, a human “pilot” will operate the robot remotely, handling items like beverage bottles, rice balls, sandwiches, and bento boxes.

With the Model-T and AWP, FamilyMart and TX aim to realize a completely new kind of store operation by performing the labor-intensive work of restocking merchandise remotely and automating it. As a result, stores can operate with fewer workers and recruit employees regardless of the store’s physical location.

[ Telexistence ]

Quadruped dance-off should be a new robotics competition at IROS or ICRA.

I dunno though, that moonwalk might keep Spot in the lead…

[ Unitree ]

Through a hybrid of simulation and real-life training, this air muscle robot is learning to play table tennis.

Table tennis requires fast and precise motions. To gain precision, it is necessary to explore in these high-speed regimes, but such exploration can also be safety-critical. The combination of RL and muscular soft robots makes it possible to close this gap. While robots actuated by pneumatic artificial muscles generate the high forces required for motions such as smashing, their antagonistic actuation also allows explosive motions to be executed safely.

To enable practical training without real balls, we introduce Hybrid Sim and Real Training (HYSR) that replays prerecorded real balls in simulation while executing actions on the real system. In this manner, RL can learn the challenging motor control of the PAM-driven robot while executing ~15000 hitting motions.
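The core HYSR trick (execute actions on the real robot while the ball exists only in simulation, replayed from recordings) can be sketched in a few lines of Python. This is a toy illustration, not the Max Planck code; the policy, reward, and the stand-in integrator for the hardware are all invented for the example:

```python
import numpy as np

def hysr_episode(ball_trajectory, policy, robot_step, dt=0.01):
    """One simplified HYSR episode:
    - ball_trajectory: prerecorded (T, 3) array of real ball positions,
      replayed in simulation instead of simulating ball physics.
    - policy: maps (racket_pos, ball_pos) -> velocity command.
    - robot_step: executes the command (on hardware in the real setup;
      here, any callable returning the new racket position).
    Reward accrues whenever the replayed ball is within hitting range."""
    racket = np.zeros(3)
    reward = 0.0
    for ball in ball_trajectory:
        action = policy(racket, ball)
        racket = robot_step(racket, action, dt)
        if np.linalg.norm(racket - ball) < 0.05:  # a "hit" in simulation
            reward += 1.0
    return reward

# Toy usage: a proportional policy chasing a replayed straight-line ball.
ball_traj = np.linspace([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], 200)
policy = lambda r, b: 5.0 * (b - r)   # move toward the ball
step = lambda r, a, dt: r + a * dt    # integrator stands in for hardware
print(hysr_episode(ball_traj, policy, step) > 0)  # True
```

In the real system, `robot_step` would send pressure commands to the pneumatic muscles; the point of HYSR is that only the ball needs to be simulated.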

[ Max Planck Institute ]

Thanks Dieter!

Anthony Cowley wrote in to share his recent thesis work on UPSLAM, a fast and lightweight SLAM technique that records data in panoramic depth images (just PNGs) that are easy to visualize and even easier to share between robots, even on low-bandwidth networks.
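Part of what makes panoramic depth images so shareable is that a 16-bit grayscale PNG stores integer depth losslessly and compresses well. A minimal sketch of such an encoding (the millimeter quantization here is an assumption for illustration; UPSLAM's actual PNG layout may differ):

```python
import numpy as np

def depth_to_u16(depth_m, max_depth=65.535):
    """Quantize a float depth image (meters) to uint16 millimeters, the
    kind of payload a 16-bit grayscale PNG can store losslessly."""
    mm = np.clip(depth_m, 0, max_depth) * 1000.0
    return np.round(mm).astype(np.uint16)

def u16_to_depth(img_u16):
    """Recover metric depth from the uint16 millimeter encoding."""
    return img_u16.astype(np.float64) / 1000.0

# Round-tripping loses at most half a millimeter per pixel.
depth = np.random.default_rng(0).uniform(0.5, 20.0, size=(4, 8))
roundtrip = u16_to_depth(depth_to_u16(depth))
print(np.allclose(roundtrip, depth, atol=6e-4))  # True
```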

[ UPenn ]

Thanks Anthony!

GITAI’s G1 is a general-purpose robot dedicated to space. The G1 will enable the automation of various tasks inside and outside space stations, as well as for lunar base development.

[ Gitai ]

The University of Michigan has a fancy new treadmill that’s built right into the floor, which proves to be a bit much for Mini Cheetah.

But Cassie Blue won’t get stuck on no treadmill! She goes for a 0.3-mile walk across campus, which ends when a certain someone runs the gantry into Cassie Blue’s foot.

[ Michigan Robotics ]

Some serious quadruped research going on at UT Austin Human Centered Robotics Lab.

[ HCRL ]

Will Burrard-Lucas has spent lockdown upgrading his slightly indestructible BeetleCam wildlife photographing robot.

[ Will Burrard-Lucas ]

Teleoperated surgical robots are becoming commonplace in operating rooms, but many are massive (sometimes taking up an entire room) and are difficult to manipulate, especially if a complication arises and the robot needs to be removed from the patient. A new collaboration between the Wyss Institute, Harvard University, and Sony Corporation has created the mini-RCM, a surgical robot the size of a tennis ball that weighs as much as a penny and performed significantly better than manually operated tools in delicate mock-surgical procedures. Importantly, its small size means it is more comparable to the human tissues and structures on which it operates, and it can easily be removed by hand if needed.

[ Harvard Wyss ]

Yaskawa appears to be working on a robot that can scan you with a temperature gun and then jam a mask on your face?

[ Motoman ]

Maybe we should just not have people working in mines anymore, how about that?

[ Exyn ]

Many current human-robot interactive systems tend to use accurate and fast, but also costly, actuators and tracking systems to establish working prototypes that are safe to use and deploy for user studies. This paper presents an embedded framework for building a desktop space for human-robot interaction, using an open-source robot arm and two RGB cameras connected to a Raspberry Pi-based controller, which together allow fast yet low-cost object tracking and manipulation in 3D. We show in our evaluations that this facilitates prototyping a number of systems in which the user and robot arm can jointly interact with physical objects.
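One way a pair of RGB cameras yields low-cost 3D tracking is ray triangulation: each camera turns a pixel detection into a ray, and the object sits where the two rays come closest. A generic sketch of that step (the paper's actual method is not specified here, and the camera poses below are invented):

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Least-squares midpoint of two camera rays (origin p, direction d):
    a minimal stand-in for two-camera 3D localization."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimizing |(p1+t1*d1)-(p2+t2*d2)|
    A = np.stack([d1, -d2], axis=1)
    t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    return 0.5 * ((p1 + t[0] * d1) + (p2 + t[1] * d2))

# Two cameras 20 cm apart on the desk, both seeing a point at (0.1, 0.2, 0.3).
target = np.array([0.1, 0.2, 0.3])
c1, c2 = np.zeros(3), np.array([0.2, 0.0, 0.0])
est = triangulate(c1, target - c1, c2, target - c2)
print(np.allclose(est, target))  # True
```

With real cameras the ray directions would come from calibrated intrinsics and detected pixel coordinates rather than from the known target.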

[ Paper ]

IBM Research is proud to host professor Yoshua Bengio — one of the world’s leading experts in AI — in a discussion of how AI can contribute to the fight against COVID-19.

[ IBM Research ]

Ira Pastor, ideaXme life sciences ambassador, interviews Professor Hiroshi Ishiguro, Director of the Intelligent Robotics Laboratory in the Department of Systems Innovation, Graduate School of Engineering Science, at Osaka University, Japan.

[ ideaXme ]

A CVPR talk from Stanford’s Chelsea Finn on “Generalization in Visuomotor Learning.”

[ Stanford ]

Posted in Human Robots

#437697 These Underwater Drones Use Water ...

Yi Chao likes to describe himself as an “armchair oceanographer” because he got incredibly seasick the one time he spent a week aboard a ship. So it’s maybe not surprising that the former NASA scientist has a vision for promoting remote study of the ocean on a grand scale by enabling underwater drones to recharge on the go using his company’s energy-harvesting technology.

Many of the robotic gliders and floating sensor stations currently monitoring the world’s oceans are effectively treated as disposable devices, because the research community has only so many ships, and only so much funding, to retrieve drones after they’ve accomplished their mission of beaming data back home. That’s not only a waste of money, but may also contribute to a growing assortment of abandoned lithium-ion batteries polluting the ocean with their leaking toxic materials—a decidedly unsustainable approach to studying the secrets of the underwater world.

“Our goal is to deploy our energy harvesting system to use renewable energy to power those robots,” says Chao, president and CEO of the startup Seatrec. “We’re going to save one battery at a time, so hopefully we’re not going to dispose of more toxic batteries in the ocean.”

Chao’s California-based startup claims that its SL1 Thermal Energy Harvesting System can already help save researchers money equivalent to an order of magnitude reduction in the cost of using robotic probes for oceanographic data collection. The startup is working on adapting its system to work with autonomous underwater gliders. And it has partnered with defense giant Northrop Grumman to develop an underwater recharging station for oceangoing drones that incorporates Northrop Grumman’s self-insulating electrical connector capable of operating while the powered electrical contacts are submerged.

Seatrec’s energy-harvesting system works by taking advantage of how certain substances transition from solid-to-liquid phase and liquid-to-gas phase when they heat up. The company’s technology harnesses the pressure changes that result from such phase changes in order to generate electricity.

Image: Seatrec

To make the phase changes happen, Seatrec’s solution taps the temperature differences between warmer water at the ocean surface and colder water at the ocean depths. Even a relatively simple robotic probe can generate additional electricity by changing its buoyancy to either float at the surface or sink down into the colder depths.
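As a rough illustration of why this is attractive, a back-of-envelope energy budget for one dive cycle might look like this (every number below is an assumption for illustration, not a Seatrec specification):

```python
# Back-of-envelope: electrical energy from one dive cycle of a
# phase-change thermal harvester. All values are illustrative guesses.
mass_pcm_kg = 2.0              # assumed mass of phase-change material
latent_heat_j_per_kg = 200e3   # paraffin-like latent heat (assumption)
conversion_efficiency = 0.01   # heat-to-electric; small ocean delta-T
                               # makes the thermodynamic limit tiny

thermal_j = mass_pcm_kg * latent_heat_j_per_kg
electrical_j = thermal_j * conversion_efficiency
watt_hours = electrical_j / 3600.0
print(f"~{watt_hours:.2f} Wh per cycle")  # ~1.11 Wh
```

Even at these modest numbers, repeating the cycle on every dive is what lets a float top up its battery indefinitely instead of being thrown away when the battery runs down.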

By attaching an external energy-harvesting module, Seatrec has already begun transforming robotic probes into assets that can be recharged and reused more affordably than sending out a ship each time to retrieve the probes. This renewable energy approach could keep such drones going almost indefinitely barring electrical or mechanical failures. “We just attach the backpack to the robots, we give them a cable providing power, and they go into the ocean,” Chao explains.

The early buyers of Seatrec’s products are primarily academic researchers who use underwater drones to collect oceanographic data. But the startup has also attracted military and government interest. It has already received small business innovation research contracts from both the U.S. Office of Naval Research and National Oceanic and Atmospheric Administration (NOAA).

Seatrec has also won two $10,000 prizes under the Powering the Blue Economy: Ocean Observing Prize administered by the U.S. Department of Energy and NOAA. The prizes awarded during the DISCOVER Competition phase back in March 2020 included one prize split with Northrop Grumman for the joint Mission Unlimited UUV Station concept. The startup and defense giant are currently looking for a robotics company to partner with for the DEVELOP Competition phase of the Ocean Observing Prize that will offer a total of $3 million in prizes.

In the long run, Seatrec hopes its energy-harvesting technology can support commercial ventures such as the aquaculture industry that operates vast underwater farms. The technology could also support underwater drones carrying out seabed surveys that pave the way for deep sea mining ventures, although those are not without controversy because of their projected environmental impacts.

Among all the possible applications, Chao seems especially enthusiastic about the prospect of Seatrec’s renewable power technology enabling underwater drones and floaters to collect oceanographic data for much longer periods of time. He spent the better part of two decades working at the NASA Jet Propulsion Laboratory in Pasadena, Calif., where he helped develop a satellite designed for monitoring the Earth’s oceans. But he and the JPL engineering team that developed Seatrec’s core technology believe that swarms of underwater drones can provide a continuous monitoring network to truly begin understanding the oceans in depth.

The COVID-19 pandemic has slowed production and delivery of Seatrec’s products somewhat given local shutdowns and supply chain disruptions. Still, the startup has been able to continue operating in part because it’s considered to be a defense contractor that is operating an essential manufacturing facility. Seatrec’s engineers and other staff members are working in shifts to practice social distancing.

“Rather than building one or two for the government, we want to scale up to build thousands, hundreds of thousands, hopefully millions, so we can improve our understanding and provide that data to the community,” Chao says.


#437683 iRobot Remembers That Robots Are ...

iRobot has released several new robots over the last few years, including the i7 and s9 vacuums. Both of these models are very fancy and very capable, packed with innovative and useful features that we’ve been impressed by. They’re both also quite expensive—with dirt docks included, you’re looking at US $800 for the i7+, and a whopping $1,100 for the s9+. You can knock a couple hundred bucks off of those prices if you don’t want the docks, but still, these vacuums are absolutely luxury items.

If you just want something that’ll do some vacuuming so that you don’t have to, iRobot has recently announced a new Roomba option. The Roomba i3 is iRobot’s new low to midrange vacuum, starting at $400. It’s not nearly as smart as the i7 or the s9, but it can navigate (sort of) and make maps (sort of) and do some basic smart home integration. If that sounds like all you need, the i3 could be the robot vacuum for you.

iRobot calls the i3 “stylish,” and it does look pretty neat with that fabric top. Underneath, you get dual rubber primary brushes plus a side brush. There’s limited compatibility with the iRobot Home app and IFTTT, along with Alexa and Google Home. The i3 is also compatible with iRobot’s Clean Base, but that’ll cost you an extra $200, and iRobot refers to this bundle as the i3+.

The reason that the i3 only offers limited compatibility with iRobot’s app is that the i3 is missing the top-mounted camera that you’ll find in more expensive models. Instead, it relies on a downward-looking optical sensor to help it navigate, and it builds up a map as it’s cleaning by keeping track of when it bumps into obstacles and paying attention to internal sensors like a gyro and wheel odometers. The i3 can localize directly on its charging station or Clean Base (which have beacons on them that the robot can see if it’s close enough), which allows it to resume cleaning after emptying its bin or recharging. You’ll get a map of the area that the i3 has cleaned once it’s finished, but that map won’t persist between cleaning sessions, meaning that you can’t do things like set keep-out zones or identify specific rooms for the robot to clean. Many of the more useful features that iRobot’s app offers are based on persistent maps, and this is probably the biggest gap in functionality between the i3 and its more expensive siblings.

According to iRobot senior global product manager Sarah Wang, the kind of augmented dead-reckoning-based mapping that the i3 uses actually works really well: “Based on our internal and external testing, the performance is equivalent with our products that have cameras, like the Roomba 960,” she says. To get this level of performance, though, you do have to be careful, Wang adds. “If you kidnap i3, then it will be very confused, because it doesn’t have a reference to know where it is.” “Kidnapping” is a term that’s used often in robotics to refer to a situation in which an autonomous robot gets moved to an unmapped location, and in the context of a home robot, the best example of this is if you decide that you want your robot to vacuum a different room instead, so you pick it up and move it there.
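Dead reckoning of this general kind integrates wheel odometry for distance while trusting the gyro for heading, since gyros drift less than wheels slip. A minimal sketch of the technique (a generic update rule for illustration, not iRobot's implementation):

```python
import math

def dead_reckon(pose, dl, dr, gyro_dtheta):
    """One dead-reckoning update: advance (x, y, theta) using the mean
    of the left/right wheel travel for distance, and the gyro for the
    heading change (typically less noisy than wheel-derived heading)."""
    x, y, theta = pose
    d = 0.5 * (dl + dr)  # distance traveled by the robot's center
    theta += gyro_dtheta
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

# Drive straight for ten 10 cm steps: the pose advances 1 m along +x.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckon(pose, 0.1, 0.1, 0.0)
print(round(pose[0], 6), round(pose[1], 6))  # 1.0 0.0
```

The kidnapping problem follows directly from this structure: the pose is only ever an accumulation of increments, so physically moving the robot injects an error that no later increment can detect or correct without an external landmark like the dock beacon.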

iRobot used to make this easy by giving all of its robots carrying handles, but not anymore, because getting moved around makes things really difficult for any robot trying to keep track of where it is. While robots like the i7 can recover using their cameras to look for unique features that they recognize, the only permanent, unique landmark that the i3 can for sure identify is the beacon on its dock. What this means is that when it comes to the i3, even more than other Roomba models, the best strategy is to just “let it do its thing,” says iRobot senior principal system engineer Landon Unninayar.

Photo: iRobot

The Roomba i3 is iRobot’s new low to midrange vacuum, starting at $400.

If you’re looking to spend a bit less than the $400 starting price of the i3, there are other options to be aware of as well. The Roomba 614, for example, does a totally decent job and costs $250. Its scheduling isn’t very clever, it doesn’t make maps, and it won’t empty itself, but it will absolutely help keep your floors clean as long as you don’t mind being a little bit more hands-on. (And there’s also Neato’s D4, which offers basic persistent maps—and lasers!—for $330.)

The other thing to consider if you’re trying to decide between the i3 and a more expensive Roomba is that without the camera, the i3 likely won’t be able to take advantage of nearly as many of the future improvements that iRobot has said it’s working on. Spending more money on a robot with additional sensors isn’t just buying what it can do now, but also investing in what it may be able to do later on, with its more sophisticated localization and ability to recognize objects. iRobot has promised major app updates every six months, and our guess is that most of the cool new stuff is going to show up in the i7 and s9. So, if your top priority is just cleaner floors, the i3 is a solid choice. But if you want a part of what iRobot is working on next, the i3 might end up holding you back.


#437645 How Robots Became Essential Workers in ...

Photo: Sivaram V/Reuters

A robot, developed by Asimov Robotics to spread awareness about the coronavirus, holds a tray with face masks and sanitizer.

As the coronavirus emergency exploded into a full-blown pandemic in early 2020, forcing countless businesses to shutter, robot-making companies found themselves in an unusual situation: Many saw a surge in orders. Robots don’t need masks, can be easily disinfected, and, of course, they don’t get sick.

An army of automatons has since been deployed all over the world to help with the crisis: They are monitoring patients, sanitizing hospitals, making deliveries, and helping frontline medical workers reduce their exposure to the virus. Not all robots operate autonomously—many, in fact, require direct human supervision, and most are limited to simple, repetitive tasks. But robot makers say the experience they’ve gained during this trial-by-fire deployment will make their future machines smarter and more capable. These photos illustrate how robots are helping us fight this pandemic—and how they might be able to assist with the next one.

DROID TEAM

Photo: Clement Uwiringiyimana/Reuters

A squad of robots serves as the first line of defense against person-to-person transmission at a medical center in Kigali, Rwanda. Patients walking into the facility get their temperature checked by the machines, which are equipped with thermal cameras atop their heads. Developed by UBTech Robotics, in China, the robots also use their distinctive appearance—they resemble characters out of a Star Wars movie—to get people’s attention and remind them to wash their hands and wear masks.

Photo: Clement Uwiringiyimana/Reuters

SAY “AAH”
To speed up COVID-19 testing, a team of Danish doctors and engineers at the University of Southern Denmark and at Lifeline Robotics is developing a fully automated swab robot. It uses computer vision and machine learning to identify the perfect target spot inside the person’s throat; then a robotic arm with a long swab reaches in to collect the sample—all done with a swiftness and consistency that humans can’t match. In this photo, one of the creators, Esben Østergaard, puts his neck on the line to demonstrate that the robot is safe.

Photo: University of Southern Denmark

GERM ZAPPER
After six of its doctors became infected with the coronavirus, the Sassarese hospital in Sardinia, Italy, tightened its safety measures. It also brought in the robots. The machines, developed by UVD Robots, use lidar to navigate autonomously. Each bot carries an array of powerful short-wavelength ultraviolet-C lights that destroy the genetic material of viruses and other pathogens after a few minutes of exposure. Now there is a spike in demand for UV-disinfection robots as hospitals worldwide deploy them to sterilize intensive care units and operating theaters.

Photo: UVD Robots

RUNNING ERRANDS

In medical facilities, an ideal role for robots is taking over repetitive chores so that nurses and physicians can spend their time doing more important tasks. At Shenzhen Third People’s Hospital, in China, a robot called Aimbot drives down the hallways, enforcing face-mask and social-distancing rules and spraying disinfectant. At a hospital near Austin, Texas, a humanoid robot developed by Diligent Robotics fetches supplies and brings them to patients’ rooms. It repeats this task day and night, tirelessly, allowing the hospital staff to spend more time interacting with patients.

Photos, left: Diligent Robotics; Right: UBTech Robotics

THE DOCTOR IS IN
Nurses and doctors at Circolo Hospital in Varese, in northern Italy—the country’s hardest-hit region—use robots as their avatars, enabling them to check on their patients around the clock while minimizing exposure and conserving protective equipment. The robots, developed by Chinese firm Sanbot, are equipped with cameras and microphones and can also access patient data like blood oxygen levels. Telepresence robots, originally designed for offices, are becoming an invaluable tool for medical workers treating highly infectious diseases like COVID-19, reducing the risk that they’ll contract the pathogen they’re fighting against.

Photo: Miguel Medina/AFP/Getty Images

HELP FROM ABOVE

Photo: Zipline

Authorities in several countries attempted to use drones to enforce lockdowns and social-distancing rules, but the effectiveness of such measures remains unclear. A better use of drones was for making deliveries. In the United States, startup Zipline deployed its fixed-wing autonomous aircraft to connect two medical facilities 17 kilometers apart. For the staff at the Huntersville Medical Center, in North Carolina, masks, gowns, and gloves literally fell from the skies. The hope is that drones like Zipline’s will one day be able to deliver other kinds of critical materials, transport test samples, and distribute drugs and vaccines.

Photos: Zipline

SPECIAL DELIVERY
It’s not quite a robot takeover, but the streets and sidewalks of dozens of cities around the world have seen a proliferation of hurrying wheeled machines. Delivery robots are now in high demand as online orders continue to skyrocket.

In Hamburg, the six-wheeled robots developed by Starship Technologies navigate using cameras, GPS, and radar to bring groceries to customers.

Photo: Christian Charisius/Picture Alliance/Getty Images

In Medellín, Colombia, a startup called Rappi deployed a fleet of robots, built by Kiwibot, to deliver takeout to people in lockdown.

Photo: Joaquin Sarmiento/AFP/Getty Images

China’s JD.com, one of the country’s largest e-commerce companies, is using 20 robots to transport goods in Changsha, Hunan province; each vehicle has 22 separate compartments, which customers unlock using face authentication.

Photos: TPG/Getty Images

LIFE THROUGH ROBOTS
Robots can’t replace real human interaction, of course, but they can help people feel more connected at a time when meetings and other social activities are mostly on hold.

In Ostend, Belgium, ZoraBots brought one of its waist-high robots, equipped with cameras, microphones, and a screen, to a nursing home, allowing residents like Jozef Gouwy to virtually communicate with loved ones despite a ban on in-person visits.

Photo: Yves Herman/Reuters

In Manila, nearly 200 high school students took turns “teleporting” into a tall wheeled robot, developed by the school’s robotics club, to walk on stage during their graduation ceremony.

Photo: Ezra Acayan/Getty Images

And while Japan’s Chiba Zoological Park was temporarily closed due to the pandemic, the zoo used an autonomous robotic vehicle called RakuRo, equipped with 360-degree cameras, to offer virtual tours to children quarantined at home.

Photo: Tomohiro Ohsumi/Getty Images

SENTRY ROBOTS
Offices, stores, and medical centers are adopting robots as enforcers of a new coronavirus code.

At Fortis Hospital in Bangalore, India, a robot called Mitra uses a thermal camera to perform a preliminary screening of patients.

Photo: Manjunath Kiran/AFP/Getty Images

In Tunisia, the police use a tanklike robot to patrol the streets of its capital city, Tunis, verifying that citizens have permission to go out during curfew hours.

Photo: Khaled Nasraoui/Picture Alliance/Getty Images

And in Singapore, the Bishan-Ang Mo Kio Park unleashed a Spot robot dog, developed by Boston Dynamics, to search for social-distancing violators. Spot won’t bark at them but will rather play a recorded message reminding park-goers to keep their distance.

Photo: Roslan Rahman/AFP/Getty Images

This article appears in the October 2020 print issue as “How Robots Became Essential Workers.”


#437639 Boston Dynamics’ Spot Is Helping ...

In terms of places where you absolutely want a robot to go instead of you, what remains of the utterly destroyed Chernobyl Reactor 4 should be very near the top of your list. The reactor, which suffered a catastrophic meltdown in 1986, has been covered up in almost every way possible in an effort to keep its nuclear core contained. But eventually, that nuclear material is going to have to be dealt with somehow, and in order to do that, it’s important to understand which bits of it are just really bad, and which bits are the actual worst. And this is where Spot is stepping in to help.

The big open space that Spot is walking through is right next to what’s left of Reactor 4. Within six months of the disaster, Reactor 4 was covered in a sarcophagus made of concrete and steel to try and keep all the nasty nuclear fuel from leaking out more than it already had, and it still contains “30 tons of highly contaminated dust, 16 tons of uranium and plutonium, and 200 tons of radioactive lava.” Oof. Over the next 10 years, the sarcophagus slowly deteriorated, and despite the addition of that gigantic network of steel support beams that you can see in the video, in the late 1990s it was decided to erect an enormous building over the entire mess to try and stabilize it for as long as possible.

Reactor 4 is now snugly inside the massive New Safe Confinement (NSC) structure, and the idea is that eventually, the structure will allow for the safe disassembly of what’s left of the reactor, although nobody is quite sure how to do that. This is all just to say that the area inside of the containment structure offers a lot of good opportunities for robots to take over from humans.

This particular Spot is owned by the U.K. Atomic Energy Authority, and was packed off to Ukraine with the assistance of the Robotics and Artificial Intelligence in Nuclear (RAIN) initiative and the National Centre for Nuclear Robotics. Dr. Dave Megson-Smith, who is a researcher at the University of Bristol, in the U.K., and part of the Hot Robotics Facility at the National Nuclear User Facility, was one of the scientists lucky enough to accompany Spot on its adventure. Megson-Smith specializes in sensor development, and he equipped Spot with a collimated radiation sensor in addition to its mapping payload. “We actually built a map of the radiation coming out of the front wall of Chernobyl power plant as we were in there with it,” Megson-Smith told us, and was able to share this picture, which shows a map of gamma photon count rate:

Image: University of Bristol

Researchers equipped Spot with a collimated radiation sensor and use one of the data readings (gamma photon count rate) to create a map of the radiation coming out of the front wall of the Chernobyl power plant.
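Turning such readings into a map can be as simple as averaging count rates into grid cells, using the robot's estimated position for each sample. A simplified stand-in for that step (the real pipeline, which fuses a collimated detector with Spot's localization, is more involved; positions and rates below are invented):

```python
import numpy as np

def radiation_map(positions, count_rates, grid_shape=(10, 10), extent=1.0):
    """Average gamma photon count-rate readings into a 2D grid map.
    positions: (x, y) sample locations in meters within [0, extent);
    count_rates: one reading per position. Cells with no samples are NaN."""
    grid_sum = np.zeros(grid_shape)
    grid_n = np.zeros(grid_shape)
    for (x, y), rate in zip(positions, count_rates):
        i = min(int(x / extent * grid_shape[0]), grid_shape[0] - 1)
        j = min(int(y / extent * grid_shape[1]), grid_shape[1] - 1)
        grid_sum[i, j] += rate
        grid_n[i, j] += 1
    with np.errstate(invalid="ignore"):
        return grid_sum / grid_n  # NaN where nothing was measured

# Three samples: two in one corner cell, one in the opposite corner.
pos = [(0.05, 0.05), (0.05, 0.06), (0.95, 0.95)]
rates = [100.0, 300.0, 50.0]
m = radiation_map(pos, rates)
print(m[0, 0], m[9, 9])  # 200.0 50.0
```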

So what’s the reason you’d want to use a very expensive legged robot to wander around what looks like a very flat and robot-friendly floor? As it turns out, the floor is very dusty in there, and a priority inside the NSC is to keep dust down as much as possible, since the dust is radioactive, gets on everything, and is consequently the easiest way for radioactivity to escape the NSC. “You want to minimize picking up material, so we consider the total contact surface area,” says Megson-Smith. “If you use a legged system rather than a wheeled or tracked system, you have a much smaller footprint and you disturb the environment a lot less.” While it’s nice that Spot is nimble and can climb stairs and stuff, tracked vehicles can do that as well, so in this case, the primary driving factor of choosing a robot to work inside Chernobyl is minimizing those contact points.

Right now, routine weekly measurements in contaminated spaces at Chernobyl are done by humans, which puts those humans at risk. Spot, or a robot like it, could potentially take over from those humans, as a sort of “automated safety checker” able to work in medium-level contaminated environments. As far as more dangerous areas go, there’s a lot of uncertainty about what Spot is actually capable of, according to Megson-Smith. “What you think the problems are, and what the industry thinks the problems are, are subtly different things.

“We were thinking that we’d have to make robots incredibly radiation proof to go into these contaminated environments, but they said, ‘Can you just give us a system that we can send into places where humans already can go, but where we just don’t want to send humans?’” Making robots incredibly radiation proof is challenging, and without extensive testing and ruggedizing, failures can be frequent, as many robots discovered at Fukushima. Indeed, Megson-Smith says that in Fukushima there’s a particular section that’s known as a “robot graveyard” where robots just go to die, and they’ve had to up their standards again and again to keep the robots from failing. “So the thing they’re worried about with Spot is, what is its tolerance? What components will fail, and what can we do to harden it?” he says. “We’re approaching Boston Dynamics at the moment to see if they’ll work with us to address some of those questions.”

There’s been a small amount of testing of how robots fare under harsh radiation, Megson-Smith told us, including (relatively recently) a KUKA LBR800 arm, which “stopped operating after a large radiation dose of 164.55(±1.09) Gy to its end effector, and the component causing the failure was an optical encoder.” And in case you’re wondering how much radiation that is, a 1 to 2 Gy dose to the entire body gets you acute radiation sickness and possibly death, while 8 Gy is usually just straight-up death. The goal here is not to kill robots (I mean, it sort of is), but as Megson-Smith says, “if we can work out what the weak points are in a robotic system, can we address those, can we redesign those, or at least understand when they might start to fail?” Now all he has to do is convince Boston Dynamics to send them a Spot that they can zap until it keels over.

The goal for Spot in the short term is fully autonomous radiation mapping, which seems very possible. It’ll also get tested with a wider range of sensor packages, and (happily for the robot) this will all take place safely back at home in the U.K. As far as Chernobyl is concerned, robots will likely have a substantial role to play in the near future. “Ultimately, Chernobyl has to be taken apart and decommissioned. That’s the long-term plan for the facility. To do that, you first need to understand everything, which is where we come in with our sensor systems and robotic platforms,” Megson-Smith tells us. “Since there are entire swathes of the Chernobyl nuclear plant where people can’t go in, we’d need robots like Spot to do those environmental characterizations.”
