Tag Archives: fight
#437845 Video Friday: Harmonic Bionics ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICRA 2020 – May 31-August 31, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.
Designed to protect employees and passengers from both harmful pathogens and cleaning agents, Breezy One can quickly, safely and effectively decontaminate spaces over 100,000 square feet in 1.5 hours with a patented, environmentally safe disinfectant. Breezy One was co-developed with the City of Albuquerque’s Aviation Department, where it autonomously sanitizes the Sunport’s facilities every night in the ongoing fight against COVID-19.
[ Fetch Robotics ]
Harmonic Bionics is redefining upper extremity neurorehabilitation with intelligent robotic technology designed to maximize patient recovery. Harmony SHR, our flagship product, works with a patient’s scapulohumeral rhythm (SHR) to enable natural, comprehensive therapy for both arms. When combined with Harmony’s Weight Support mode, this unique shoulder design may allow for earlier initiation of post-stroke therapy, as Harmony can support a partially dislocated (subluxated) shoulder prior to initiating traditional therapy exercises.
Harmony’s Preprogrammed Exercises feature promotes functional treatment through patient-specific movements that can enable an increased number of repetitions per session without placing a larger physical burden on therapists or their resources. As the only rehabilitation exoskeleton with Bilateral Sync Therapy (BST), Harmony enables intent-based therapy by registering healthy arm movements and synchronizing that motion onto the stroke-affected side to help reestablish neural pathways.
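Harmonic Bionics hasn’t published its control stack, but the core of bilateral sync is easy to picture: joint angles measured on the healthy arm are mirrored onto the affected arm’s exoskeleton in real time. Here’s a minimal sketch of that mirroring idea; the joint names and sign conventions are illustrative assumptions, not Harmony’s actual interface.

```python
# Hypothetical sketch of bilateral sync: mirror the healthy arm's joint angles
# onto the affected arm. Joint names and sign conventions are assumptions.
MIRROR_SIGN = {
    "shoulder_flexion": 1.0,     # flexion/extension carries over directly
    "shoulder_abduction": -1.0,  # abduction/rotation flip sign across the midline
    "shoulder_rotation": -1.0,
    "elbow_flexion": 1.0,
}

def mirror_to_affected_arm(healthy_angles):
    """Map healthy-arm joint angles (radians) to affected-arm setpoints."""
    return {joint: MIRROR_SIGN[joint] * angle
            for joint, angle in healthy_angles.items()}

# Example: a small reach with the healthy arm becomes a synchronized
# setpoint command for the stroke-affected side.
setpoints = mirror_to_affected_arm({"shoulder_flexion": 0.4, "elbow_flexion": 0.9})
```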
[ Harmonic Bionics ]
Thanks Mok!
Some impressive work here from IHMC and IIT getting Atlas to take steps upward in a way that’s much more human-like than robot-like, which ends up reducing maximum torque requirements by 20 percent.
[ Paper ]
GITAI’s G1 is a general-purpose robot designed for space. It will enable the automation of various tasks inside and outside space stations, as well as for lunar base development.
[ GITAI ]
Malloy Aeronautics, which now makes drones rather than hoverbikes, has been working with the Royal Navy in New Zealand to figure out how to get cargo drones to land on ships.
The challenge was to test autonomous landing of heavy-lift UAVs on a moving ship. However, due to the COVID-19 lockdown, no ship trials were possible, so the moving deck was simulated by driving a vehicle and trailer across an airfield while carrying out multiple landings and takeoffs. The autonomous-system partner was Planck Aerosystems, and autolanding was triggered by a camera on the UAV reading a QR code on the trailer.
[ Malloy Aeronautics ]
Thanks Paul!
Tertill looks to be relentlessly effective.
[ Franklin Robotics ]
Swedish company Tiki Safety has experienced a record number of orders for its protective masks. At ABB, we are grateful for the opportunity to help Tiki Safety speed up its manufacturing process from 6 minutes to 40 seconds.
[ Tiki Safety ]
The Korea Atomic Energy Research Institute is not messing around with ARMstrong, their robot for nuclear and radiation emergency response.
[ KAERI ]
OMOY is a robot that communicates with its users via internal weight shifting.
[ Paper ]
Now this, this is some weird stuff.
[ Segway ]
CaTARo is a Care Training Assistant Robot from the AIS Lab at Ritsumeikan University.
[ AIS Lab ]
Originally launched in 2015 to assist workers in lightweight assembly tasks, ABB’s collaborative YuMi robot has gone on to blaze a trail in a raft of diverse applications and industries, opening new opportunities and helping to fire people’s imaginations about what can be achieved with robotic automation.
[ ABB ]
This music video features COMAN+, from the Humanoids and Human Centered Mechatronics Lab at IIT, doing what you’d call dance moves if you dance like I do.
[ Alex Braga ] via [ IIT ]
The NVIDIA Isaac Software Development Kit (SDK) enables accelerated AI robot development workflows. Stacked with new tools and application support, Isaac SDK 2020.1 is an end-to-end solution supporting each step of robot fleet deployment, from design collaboration and training to the ongoing maintenance of AI applications.
[ NVIDIA ]
Robot Spy Komodo Dragon and Spy Pig film “a tender moment” between Komodo dragons, but will they both survive the encounter?
[ BBC ] via [ Laughing Squid ]
This is part one of a mostly excellent five-part documentary about ROS produced by Red Hat. I say mostly only because they put ME in it for some reason, but fortunately, they talked with many of the core team that developed ROS back at Willow Garage back in the day, and it’s definitely worth watching.
[ Red Hat Open Source Stories ]
It’s been a while, but here’s an update on SRI’s Abacus Drive, from Alexander Kernbaum.
[ SRI ]
This Robots For Infectious Diseases interview features IEEE Fellow Antonio Bicchi, professor of robotics at the University of Pisa, talking about how Italy has been using technology to help manage COVID-19.
[ R4ID ]
Two more interviews this week of celebrity roboticists from MassRobotics: Helen Greiner and Marc Raibert. I’d introduce them, but you know who they are already!
[ MassRobotics ]
#437707 Video Friday: This Robot Will Restock ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.
Tokyo startup Telexistence has recently unveiled a new robot called the Model-T, an advanced teleoperated humanoid that can use tools and grasp a wide range of objects. Japanese convenience store chain FamilyMart plans to test the Model-T to restock shelves in up to 20 stores by 2022. In the trial, a human “pilot” will operate the robot remotely, handling items like beverage bottles, rice balls, sandwiches, and bento boxes.
With the Model-T and AWP, FamilyMart and TX aim to realize a completely new kind of store operation by making the labor-intensive work of restocking merchandise remote and automated. As a result, stores can operate with fewer workers and recruit employees regardless of the store’s physical location.
[ Telexistence ]
Quadruped dance-off should be a new robotics competition at IROS or ICRA.
I dunno though, that moonwalk might keep Spot in the lead…
[ Unitree ]
Through a hybrid of simulation and real-life training, this air muscle robot is learning to play table tennis.
Table tennis requires fast and precise motions. Gaining precision demands exploration in these high-speed regimes, yet exploration can be safety-critical at the same time. The combination of RL and muscular soft robots closes this gap: robots actuated by pneumatic artificial muscles generate the high forces required for, e.g., smashing, while their antagonistic actuation allows explosive motions to be executed safely.
To enable practical training without real balls, we introduce Hybrid Sim and Real Training (HYSR), which replays prerecorded real balls in simulation while executing actions on the real system. In this manner, RL can learn the challenging motor control of the PAM-driven robot while executing ~15,000 hitting motions.
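The HYSR trick, replaying recorded real balls in simulation while the policy acts on the robot, is simple enough to sketch. Below is a hypothetical outline of one training episode; the environment interface (set_ball_state, step) is an assumed stand-in for illustration, not the Max Planck team’s actual code.

```python
import random

def hysr_episode(env, ball_recordings, policy):
    """One HYSR episode: a prerecorded real ball is replayed step by step in
    simulation while the learned policy drives the PAM-actuated arm."""
    ball = random.choice(ball_recordings)  # one recorded real ball trajectory
    obs = env.reset()
    episode_return = 0.0
    for ball_state in ball:
        env.set_ball_state(ball_state)     # ball comes from the recording
        action = policy(obs)               # muscle pressures come from the policy
        obs, reward, done, info = env.step(action)
        episode_return += reward
        if done:                           # e.g., ball hit or missed
            break
    return episode_return
```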
[ Max Planck Institute ]
Thanks Dieter!
Anthony Cowley wrote in to share his recent thesis work on UPSLAM, a fast and lightweight SLAM technique that records data in panoramic depth images (just PNGs) that are easy to visualize and even easier to share between robots, even on low-bandwidth networks.
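UPSLAM’s exact file format isn’t spelled out here, but the appeal of “just PNGs” is concrete: a common convention (assumed below, not necessarily UPSLAM’s) stores depth in millimeters as a 16-bit grayscale PNG, which any image viewer can open and any low-bandwidth link can ship around cheaply.

```python
import numpy as np
import cv2

def save_depth_panorama(depth_m, path):
    """Store a float depth panorama (meters) as a 16-bit PNG (millimeters)."""
    depth_mm = np.clip(depth_m * 1000.0, 0, 65535).astype(np.uint16)
    cv2.imwrite(path, depth_mm)  # PNG natively supports 16-bit single-channel

def load_depth_panorama(path):
    """Read the PNG back and convert millimeters to meters."""
    depth_mm = cv2.imread(path, cv2.IMREAD_UNCHANGED)  # keeps the 16-bit depth
    return depth_mm.astype(np.float32) / 1000.0
```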
[ UPenn ]
Thanks Anthony!
GITAI’s G1 is a general-purpose robot designed for space. It will enable the automation of various tasks inside and outside space stations, as well as for lunar base development.
[ Gitai ]
The University of Michigan has a fancy new treadmill that’s built right into the floor, which proves to be a bit much for Mini Cheetah.
But Cassie Blue won’t get stuck on no treadmill! She goes for a 0.3-mile walk across campus, which ends when a certain someone runs the gantry into Cassie Blue’s foot.
[ Michigan Robotics ]
Some serious quadruped research going on at UT Austin Human Centered Robotics Lab.
[ HCRL ]
Will Burrard-Lucas has spent lockdown upgrading his slightly indestructible BeetleCam wildlife photographing robot.
[ Will Burrard-Lucas ]
Teleoperated surgical robots are becoming commonplace in operating rooms, but many are massive (sometimes taking up an entire room) and are difficult to manipulate, especially if a complication arises and the robot needs to be removed from the patient. A new collaboration among the Wyss Institute, Harvard University, and Sony Corporation has created the mini-RCM, a surgical robot the size of a tennis ball that weighs as much as a penny and that performed significantly better than manually operated tools in delicate mock-surgical procedures. Importantly, its small size means it is more comparable to the human tissues and structures on which it operates, and it can easily be removed by hand if needed.
[ Harvard Wyss ]
Yaskawa appears to be working on a robot that can scan you with a temperature gun and then jam a mask on your face?
[ Motoman ]
Maybe we should just not have people working in mines anymore, how about that?
[ Exyn ]
Many current human-robot interactive systems tend to use accurate and fast, but also costly, actuators and tracking systems to establish working prototypes that are safe to use and deploy for user studies. This paper presents an embedded framework for building a desktop space for human-robot interaction, using an open-source robot arm and two RGB cameras connected to a Raspberry Pi-based controller, which together enable fast yet low-cost object tracking and manipulation in 3D. We show in our evaluations that this facilitates prototyping a number of systems in which a user and a robot arm can jointly interact with physical objects.
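The paper’s two-camera setup boils down to classic stereo triangulation, which OpenCV handles directly. Here is a minimal sketch, assuming the projection matrices P1 and P2 come from a one-time stereo calibration and the pixel coordinates come from a simple color or blob detector.

```python
import numpy as np
import cv2

def triangulate_object(P1, P2, uv1, uv2):
    """Recover a 3D point from its pixel coordinates in two calibrated cameras.

    P1, P2: 3x4 projection matrices from a one-time stereo calibration.
    uv1, uv2: (u, v) pixel coordinates of the tracked object in each camera.
    """
    pts1 = np.asarray(uv1, dtype=np.float64).reshape(2, 1)
    pts2 = np.asarray(uv2, dtype=np.float64).reshape(2, 1)
    point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous 4x1
    return (point_h[:3] / point_h[3]).ravel()            # metric (x, y, z)
```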
[ Paper ]
IBM Research is proud to host professor Yoshua Bengio — one of the world’s leading experts in AI — in a discussion of how AI can contribute to the fight against COVID-19.
[ IBM Research ]
Ira Pastor, ideaXme life sciences ambassador interviews Professor Dr. Hiroshi Ishiguro, the Director of the Intelligent Robotics Laboratory, of the Department of Systems Innovation, in the Graduate School of Engineering Science, at Osaka University, Japan.
[ ideaXme ]
A CVPR talk from Stanford’s Chelsea Finn on “Generalization in Visuomotor Learning.”
[ Stanford ]
#437645 How Robots Became Essential Workers in ...
Photo: Sivaram V/Reuters
A robot, developed by Asimov Robotics to spread awareness about the coronavirus, holds a tray with face masks and sanitizer.
As the coronavirus emergency exploded into a full-blown pandemic in early 2020, forcing countless businesses to shutter, robot-making companies found themselves in an unusual situation: Many saw a surge in orders. Robots don’t need masks, can be easily disinfected, and, of course, they don’t get sick.
An army of automatons has since been deployed all over the world to help with the crisis: They are monitoring patients, sanitizing hospitals, making deliveries, and helping frontline medical workers reduce their exposure to the virus. Not all robots operate autonomously—many, in fact, require direct human supervision, and most are limited to simple, repetitive tasks. But robot makers say the experience they’ve gained during this trial-by-fire deployment will make their future machines smarter and more capable. These photos illustrate how robots are helping us fight this pandemic—and how they might be able to assist with the next one.
DROID TEAM
Photo: Clement Uwiringiyimana/Reuters
A squad of robots serves as the first line of defense against person-to-person transmission at a medical center in Kigali, Rwanda. Patients walking into the facility get their temperature checked by the machines, which are equipped with thermal cameras atop their heads. Developed by UBTech Robotics, in China, the robots also use their distinctive appearance—they resemble characters out of a Star Wars movie—to get people’s attention and remind them to wash their hands and wear masks.
Photo: Clement Uwiringiyimana/Reuters
SAY “AAH”
To speed up COVID-19 testing, a team of Danish doctors and engineers at the University of Southern Denmark and at Lifeline Robotics is developing a fully automated swab robot. It uses computer vision and machine learning to identify the perfect target spot inside the person’s throat; then a robotic arm with a long swab reaches in to collect the sample—all done with a swiftness and consistency that humans can’t match. In this photo, one of the creators, Esben Østergaard, puts his neck on the line to demonstrate that the robot is safe.
Photo: University of Southern Denmark
GERM ZAPPER
After six of its doctors became infected with the coronavirus, the Sassarese hospital in Sardinia, Italy, tightened its safety measures. It also brought in the robots. The machines, developed by UVD Robots, use lidar to navigate autonomously. Each bot carries an array of powerful short-wavelength ultraviolet-C lights that destroy the genetic material of viruses and other pathogens after a few minutes of exposure. Now there is a spike in demand for UV-disinfection robots as hospitals worldwide deploy them to sterilize intensive care units and operating theaters.
Photo: UVD Robots
RUNNING ERRANDS
In medical facilities, an ideal role for robots is taking over repetitive chores so that nurses and physicians can spend their time doing more important tasks. At Shenzhen Third People’s Hospital, in China, a robot called Aimbot drives down the hallways, enforcing face-mask and social-distancing rules and spraying disinfectant. At a hospital near Austin, Texas, a humanoid robot developed by Diligent Robotics fetches supplies and brings them to patients’ rooms. It repeats this task day and night, tirelessly, allowing the hospital staff to spend more time interacting with patients.
Photos, left: Diligent Robotics; right: UBTech Robotics
THE DOCTOR IS IN
Nurses and doctors at Circolo Hospital in Varese, in northern Italy—the country’s hardest-hit region—use robots as their avatars, enabling them to check on their patients around the clock while minimizing exposure and conserving protective equipment. The robots, developed by Chinese firm Sanbot, are equipped with cameras and microphones and can also access patient data like blood oxygen levels. Telepresence robots, originally designed for offices, are becoming an invaluable tool for medical workers treating highly infectious diseases like COVID-19, reducing the risk that they’ll contract the pathogen they’re fighting against.
Photo: Miguel Medina/AFP/Getty Images
HELP FROM ABOVE
Photo: Zipline
Authorities in several countries attempted to use drones to enforce lockdowns and social-distancing rules, but the effectiveness of such measures remains unclear. A better use of drones was for making deliveries. In the United States, startup Zipline deployed its fixed-wing autonomous aircraft to connect two medical facilities 17 kilometers apart. For the staff at the Huntersville Medical Center, in North Carolina, masks, gowns, and gloves literally fell from the skies. The hope is that drones like Zipline’s will one day be able to deliver other kinds of critical materials, transport test samples, and distribute drugs and vaccines.
Photos: Zipline
SPECIAL DELIVERY
It’s not quite a robot takeover, but the streets and sidewalks of dozens of cities around the world have seen a proliferation of hurrying wheeled machines. Delivery robots are now in high demand as online orders continue to skyrocket.
In Hamburg, the six-wheeled robots developed by Starship Technologies navigate using cameras, GPS, and radar to bring groceries to customers.
Photo: Christian Charisius/Picture Alliance/Getty Images
In Medellín, Colombia, a startup called Rappi deployed a fleet of robots, built by Kiwibot, to deliver takeout to people in lockdown.
Photo: Joaquin Sarmiento/AFP/Getty Images
China’s JD.com, one of the country’s largest e-commerce companies, is using 20 robots to transport goods in Changsha, Hunan province; each vehicle has 22 separate compartments, which customers unlock using face authentication.
Photos: TPG/Getty Images
LIFE THROUGH ROBOTS
Robots can’t replace real human interaction, of course, but they can help people feel more connected at a time when meetings and other social activities are mostly on hold.
In Ostend, Belgium, ZoraBots brought one of its waist-high robots, equipped with cameras, microphones, and a screen, to a nursing home, allowing residents like Jozef Gouwy to virtually communicate with loved ones despite a ban on in-person visits.
Photo: Yves Herman/Reuters
In Manila, nearly 200 high school students took turns “teleporting” into a tall wheeled robot, developed by the school’s robotics club, to walk on stage during their graduation ceremony.
Photo: Ezra Acayan/Getty Images
And while Japan’s Chiba Zoological Park was temporarily closed due to the pandemic, the zoo used an autonomous robotic vehicle called RakuRo, equipped with 360-degree cameras, to offer virtual tours to children quarantined at home.
Photo: Tomohiro Ohsumi/Getty Images
SENTRY ROBOTS
Offices, stores, and medical centers are adopting robots as enforcers of a new coronavirus code.
At Fortis Hospital in Bangalore, India, a robot called Mitra uses a thermal camera to perform a preliminary screening of patients.
Photo: Manjunath Kiran/AFP/Getty Images
In Tunisia, the police use a tanklike robot to patrol the streets of its capital city, Tunis, verifying that citizens have permission to go out during curfew hours.
Photo: Khaled Nasraoui/Picture Alliance/Getty Images
And in Singapore, the Bishan-Ang Mo Kio Park unleashed a Spot robot dog, developed by Boston Dynamics, to search for social-distancing violators. Spot won’t bark at them but will instead play a recorded message reminding park-goers to keep their distance.
Photo: Roslan Rahman/AFP/Getty Images
This article appears in the October 2020 print issue as “How Robots Became Essential Workers.”
#437614 Video Friday: Poimo Is a Portable ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
IROS 2020 – October 25-29, 2020 – [Online]
ROS World 2020 – November 12, 2020 – [Online]
CYBATHLON 2020 – November 13-14, 2020 – [Online]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.
Engineers at the University of California San Diego have built a squid-like robot that can swim untethered, propelling itself by generating jets of water. The robot carries its own power source inside its body. It can also carry a sensor, such as a camera, for underwater exploration.
[ UCSD ]
Thanks Ioana!
Shark Robotics, a French and European leader in unmanned ground vehicles, is announcing today a disinfection add-on for Boston Dynamics’ Spot robot, designed to fight the COVID-19 pandemic. With Shark’s purpose-built disinfection payload, Spot can decontaminate up to 2,000 square meters in 15 minutes in any space that needs to be sanitized, such as hospitals, metro stations, offices, and warehouses.
[ Shark Robotics ]
Here’s an update on the Poimo portable inflatable mobility project we wrote about a little while ago; while not strictly robotics, it seems like it holds some promise for rapidly developing different soft structures that robotics might find useful.
[ University of Tokyo ]
Thanks Ryuma!
Pretty cool that you can do useful force feedback teleop while video chatting through a “regular broadband Internet connection.” Although, what “regular” means to you is a bit subjective, right?
[ HEBI Robotics ]
Thanks Dave!
While NASA's Mars rover Perseverance travels through space toward the Red Planet, its nearly identical rover twin is hard at work on Earth. The vehicle system test bed (VSTB) rover named OPTIMISM is a full-scale engineering version of the Mars-bound rover. It is used to test hardware and software before the commands are sent up to the Perseverance rover.
[ NASA ]
Jacquard takes ordinary, familiar objects and enhances them with new digital abilities and experiences, while remaining true to their original purpose — like being your favorite jacket, backpack or a pair of shoes that you love to wear.
Our ambition is simple: to make life easier. By staying connected to your digital world, your things can do so much more. Skip a song by brushing your sleeve. Take a picture by tapping on a shoulder strap. Get reminded about the phone you left behind with a blink of light or a haptic buzz on your cuff.
[ Google ATAP ]
Should you attend the IROS 2020 workshop on “Planetary Exploration Robots: Challenges and Opportunities”? Of course you should!
[ Workshop ]
Kuka makes a lot of these videos where I can’t help but think that if they put as much effort into programming the robot as they did into producing the video, the result would be much more impressive.
[ Kuka ]
The Colorado School of Mines is one of the first customers to buy a Spot robot from Boston Dynamics to help with robotics research. Watch as scientists take Spot into the school's mine for the first time.
[ HCR ] via [ CNET ]
A very interesting soft(ish) actuator from Ayato Kanada at Kyushu University's Control Engineering Lab.
A flexible ultrasonic motor (FUSM) generates linear motion as a novel soft actuator. The motor consists of a single metal cube stator with a hole and an elastic elongated coil spring inserted into the hole. When voltages are applied to piezoelectric plates on the stator, the coil spring moves back and forth as a linear slider. Since the FUSM is based on a friction drive, the most important parameter for optimizing its output is the preload between the stator and the slider. The coil spring has a slightly larger diameter than the stator hole and generates the preload by expanding in the radial direction. The coil spring acts not only as a flexible slider but also as a resistive positional sensor: changes in the resistance between the stator and the end of the coil spring are converted to a voltage and used for position detection.
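As a toy illustration of that last sensing step, here’s how the measured voltage might map to slider position once the sensor has been calibrated. The linear transfer function and the endpoint values are assumptions for illustration, not numbers from the paper.

```python
def voltage_to_position(v, v_min=0.2, v_max=3.1, stroke_mm=50.0):
    """Map the voltage measured across the coil spring (volts) to slider
    position (millimeters), assuming a linear, pre-calibrated sensor."""
    v = min(max(v, v_min), v_max)                      # clamp to calibrated range
    return (v - v_min) / (v_max - v_min) * stroke_mm   # linear interpolation
```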
[ Control Engineering Lab ]
Thanks Ayato!
We show how to use the limbs of a quadruped robot to identify fine-grained soil representative of Martian regolith.
[ Paper ] via [ ANYmal Research ]
PR2 is serving breakfast and cleaning up afterwards. It’s slow, but all you have to do is eat and leave.
That poor PR2 is a little more naked than it's probably comfortable with.
[ EASE ]
NVIDIA researchers present a hierarchical framework that combines model-based control and reinforcement learning (RL) to synthesize robust controllers for a quadruped robot (the Unitree Laikago).
[ NVIDIA ]
What's interesting about this assembly task is that the robot is using its arm only for positioning, and doing the actual assembly with just fingers.
[ RC2L ]
In this electronics assembly application, Kawasaki's cobot duAro2 uses a tool changing station to tackle a multitude of tasks and assemble different CPU models.
Okay but can it apply thermal paste to a CPU in the right way? Personally, I find that impossible.
[ Kawasaki ]
You only need to watch this video long enough to appreciate the concept of putting a robot on a robot.
[ Impress ]
In this lecture, we’ll hear from the man behind one of the biggest robotics companies in the world, Boston Dynamics, whose robotic dog, Spot, has been used to encourage social distancing in Singapore and is now getting ready for FDA approval to be able to measure patients’ vital signs in hospitals.
[ Alan Turing Institute ]
Greg Kahn from UC Berkeley wrote in to share his recent dissertation talk on “Mobile Robot Learning.”
In order to create mobile robots that can autonomously navigate real-world environments, we need generalizable perception and control systems that can reason about the outcomes of navigational decisions. Learning-based methods, in which the robot learns to navigate by observing the outcomes of navigational decisions in the real world, offer considerable promise for obtaining these intelligent navigation systems. However, there are many challenges impeding mobile robots from autonomously learning to act in the real world, in particular: (1) sample efficiency: how to learn using a limited amount of data; (2) supervision: how to tell the robot what to do; and (3) safety: how to ensure the robot and environment are not damaged or destroyed during learning. In this talk, I will present deep reinforcement learning methods for addressing these real-world mobile robot learning challenges and show results that enable ground and aerial robots to navigate in complex indoor and outdoor environments.
[ UC Berkeley ]
Thanks Greg!
Leila Takayama from UC Santa Cruz (and previously Google X and Willow Garage) gives a talk entitled “Toward a more human-centered future of robotics.”
Robots are no longer only in outer space, in factory cages, or in our imaginations. We interact with robotic agents when withdrawing cash from bank ATMs, driving cars with adaptive cruise control, and tuning our smart home thermostats. In the moment of those interactions with robotic agents, we behave in ways that do not necessarily align with the rational belief that robots are just plain machines. Through a combination of controlled experiments and field studies, we use theories and concepts from the social sciences to explore ways that human and robotic agents come together, including how people interact with personal robots and how people interact through telepresence robots. Together, we will explore topics and raise questions about the psychology of human-robot interaction and how we could invent a future of a more human-centered robotics that we actually want to live in.
[ Leila Takayama ]
Roboticist and stand-up comedian Naomi Fitter from Oregon State University gives a talk on “Everything I Know about Telepresence.”
Telepresence robots hold promise to connect people by providing videoconferencing and navigation abilities in far-away environments. At the same time, the impacts of current commercial telepresence robots are not well understood, and circumstances of robot use, including internet connection stability, odd personalizations, and the interpersonal relationship between a robot operator and the people co-located with the robot, can overshadow the benefits of the robot itself. And although the idea of telepresence robots has been around for over two decades, available nonverbal expressive abilities through telepresence robots are limited, and suitable operator user interfaces for the robot (for example, controls that allow the operator to hold a conversation and move the robot simultaneously) remain elusive. So where should we be using telepresence robots? Are there any pitfalls to watch out for? What do we know about potential robot expressivity and user interfaces? This talk will cover my attempts to address these questions and ways in which the robotics research community can build on this work.
[ Talking Robotics ]
#437579 Disney Research Makes Robotic Gaze ...
While it’s not totally clear to what extent human-like robots are better than conventional robots for most applications, one area I’m personally comfortable with them is entertainment. The folks over at Disney Research, who are all about entertainment, have been working on this sort of thing for a very long time, and some of their animatronic attractions are actually quite impressive.
The next step for Disney is to make its animatronic figures, which currently feature scripted behaviors, perform interactively with visitors. The challenge is that this is where you start to get into potential uncanny valley territory, which can happen when you try to create “the illusion of life,” and that is explicitly what Disney says it’s trying to do.
In a paper presented at IROS this month, a team from Disney Research, Caltech, University of Illinois at Urbana-Champaign, and Walt Disney Imagineering is trying to nail that illusion of life with a single, and perhaps most important, social cue: eye gaze.
Before you watch this video, keep in mind that you’re watching a specific character, as Disney describes:
The robot character plays an elderly man reading a book, perhaps in a library or on a park bench. He has difficulty hearing and his eyesight is in decline. Even so, he is constantly distracted from reading by people passing by or coming up to greet him. Most times, he glances at people moving quickly in the distance, but as people encroach into his personal space, he will stare with disapproval for the interruption, or provide those that are familiar to him with friendly acknowledgment.
What, exactly, does “lifelike” mean in the context of robotic gaze? The paper abstract describes the goal as “[seeking] to create an interaction which demonstrates the illusion of life.” I suppose you could think of it like a sort of old-fashioned Turing test focused on gaze: If the gaze of this robot cannot be distinguished from the gaze of a human, then victory, that’s lifelike. And critically, we’re talking about mutual gaze here—not just a robot gazing off into the distance, but you looking deep into the eyes of this robot and it looking right back at you just like a human would. Or, just like some humans would.
The approach that Disney is using is more animation-y than biology-y or psychology-y. In other words, they’re not trying to figure out what’s going on in our brains to make our eyes move the way that they do when we’re looking at other people and basing their control system on that, but instead, Disney just wants it to look right. This “visual appeal” approach is totally fine, and there’s been an enormous amount of human-robot interaction (HRI) research behind it already, albeit usually with less explicitly human-like platforms. And speaking of human-like platforms, the hardware is a “custom Walt Disney Imagineering Audio-Animatronics bust,” which has DoFs that include neck, eyes, eyelids, and eyebrows.
In order to decide on gaze motions, the system first identifies a person to target with its attention using an RGB-D camera. If more than one person is visible, the system calculates a curiosity score for each, currently simplified to be based on how much motion it sees. Depending on which visible person has the highest curiosity score, the system will choose from a variety of high-level gaze behavior states, including:
Read: The Read state can be considered the “default” state of the character. When not executing another state, the robot character will return to the Read state. Here, the character will appear to read a book located at torso level.
Glance: A transition to the Glance state from the Read or Engage states occurs when the attention engine indicates that there is a stimulus with a curiosity score […] above a certain threshold.
Engage: The Engage state occurs when the attention engine indicates that there is a stimulus […] meeting a threshold and can be triggered from both Read and Glance states. This state causes the robot to gaze at the person-of-interest with both the eyes and head.
Acknowledge: The Acknowledge state is triggered from either Engage or Glance states when the person-of-interest is deemed to be familiar to the robot.
Running underneath these higher level behavior states are lower level motion behaviors like breathing, small head movements, eye blinking, and saccades (the quick eye movements that occur when people, or robots, look between two different focal points). The term for this hierarchical behavioral state layering is a subsumption architecture, which goes all the way back to Rodney Brooks’ work on robots like Genghis in the 1980s and Cog and Kismet in the ’90s, and it provides a way for more complex behaviors to emerge from a set of simple, decentralized low-level behaviors.
“25 years on Disney is using my subsumption architecture for humanoid eye control, better and smoother now than our 1995 implementations on Cog and Kismet.”
—Rodney Brooks, MIT emeritus professor
Brooks, an emeritus professor at MIT and, most recently, cofounder and CTO of Robust.ai, tweeted about the Disney project, saying: “People underestimate how long it takes to get from academic paper to real world robotics. 25 years on Disney is using my subsumption architecture for humanoid eye control, better and smoother now than our 1995 implementations on Cog and Kismet.”
From the paper:
Although originally intended for control of mobile robots, we find that the subsumption architecture, as presented in [17], lends itself as a framework for organizing animatronic behaviors. This is due to the analogous use of subsumption in human behavior: human psychomotor behavior can be intuitively modeled as layered behaviors with incoming sensory inputs, where higher behavioral levels are able to subsume lower behaviors. At the lowest level, we have involuntary movements such as heartbeats, breathing and blinking. However, higher behavioral responses can take over and control lower level behaviors, e.g., fight-or-flight response can induce faster heart rate and breathing. As our robot character is modeled after human morphology, mimicking biological behaviors through the use of a bottom-up approach is straightforward.
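To make the layering concrete, here is a minimal Python sketch of the high-level state machine as the article describes it, with an always-on layer of low-level behaviors underneath. The thresholds, state transitions, and command names are illustrative assumptions paraphrased from the article, not Disney’s actual implementation.

```python
GLANCE_THRESHOLD = 0.5   # illustrative values, not Disney's
ENGAGE_THRESHOLD = 0.8

def next_state(state, curiosity, familiar):
    """High-level behavior selection; Read is the default state."""
    if familiar and state in ("Engage", "Glance"):
        return "Acknowledge"                              # triggered from Engage or Glance
    if curiosity >= ENGAGE_THRESHOLD and state in ("Read", "Glance"):
        return "Engage"                                   # gaze with both eyes and head
    if curiosity >= GLANCE_THRESHOLD and state in ("Read", "Engage"):
        return "Glance"                                   # quick look from Read or Engage
    return "Read"                                         # fall back to reading the book

def control_step(state, curiosity, familiar):
    """One control tick: higher layers subsume (override) lower ones, while the
    lowest layer (breathing, blinking, saccades) never stops running."""
    commands = {"breathe": True, "blink": True, "saccade": True}  # always-on layer
    state = next_state(state, curiosity, familiar)
    if state in ("Engage", "Acknowledge"):
        commands["gaze_target"] = "person_of_interest"
    elif state == "Glance":
        commands["gaze_target"] = "stimulus"
    else:
        commands["gaze_target"] = "book"                  # default: reading at torso level
    return state, commands
```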
The result, as the video shows, appears to be quite good, although it’s hard to tell how it would all come together if the robot had more of, you know, a face. But it seems like you don’t necessarily need to have a lifelike humanoid robot to take advantage of this architecture in an HRI context—any robot that wants to make a gaze-based connection with a human could benefit from doing it in a more human-like way.
“Realistic and Interactive Robot Gaze,” by Matthew K.X.J. Pan, Sungjoon Choi, James Kennedy, Kyna McIntosh, Daniel Campos Zamora, Gunter Niemeyer, Joohyung Kim, Alexis Wieland, and David Christensen from Disney Research, California Institute of Technology, University of Illinois at Urbana-Champaign, and Walt Disney Imagineering, was presented at IROS 2020. You can find the full paper, along with a 13-minute video presentation, on the IROS on-demand conference website.