#439136 Video Friday: Aquatic Snakebotics
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICRA 2021 – May 30-June 5, 2021 – [Online Event]
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.
Researchers from the Biorobotics Lab in the School of Computer Science’s Robotics Institute at Carnegie Mellon University tested the hardened underwater modular robot snake (HUMRS) last month in the pool, diving the robot through underwater hoops, showing off its precise and smooth swimming, and demonstrating its ease of control.
The robot's modular design allows it to adapt to different tasks, whether squeezing through tight spaces under rubble, climbing up a tree or slithering around a corner underwater. For the underwater robot snake, the team used existing watertight modules that allow the robot to operate in bad conditions. They then added new modules containing the turbines and thrusters needed to maneuver the robot underwater.
[ CMU ]
Robots are learning how not to fall over after stepping on your foot and kicking you in the shin.
[ B-Human ]
Like boot prints on the Moon, NASA's OSIRIS-REx spacecraft left its mark on asteroid Bennu. Now, new images—taken during the spacecraft's final fly-over on April 7, 2021—reveal the aftermath of the historic Touch-and-Go (TAG) sample acquisition event from Oct. 20, 2020.
[ NASA ]
In recognition of National Robotics Week, Conan O'Brien thanks one of the robots that works for him.
[ YouTube ]
The latest from Wandercraft's self-balancing Atalante exo.
[ Wandercraft ]
Stocking supermarket shelves is one of those things that's much more difficult than it looks for robots, involving in-hand manipulation, motion planning, vision, and tactile sensing. Easy for humans, but robots are getting better.
[ Article ]
Thanks Marco!
Draganfly drone spraying Varigard disinfectant at the Smoothie King stadium. Our drone sanitization spraying technology is up to 100% more efficient and effective than conventional manual spray sterilization processes.
[ Draganfly ]
Baubot is a mobile construction robot that can do pretty much everything, apparently.
I’m pretty skeptical of robots like these, especially ones that bill themselves as platforms that can be monetized by third-party developers. From what we've seen, the most successful robots instead focus on doing one thing very well.
[ Baubot ]
In this demo, a remote operator sends an unmanned ground vehicle on an autonomous inspection mission via Clearpath’s web-based Outdoor Navigation Software.
[ Clearpath ]
Aurora’s Odysseus aircraft is a high-altitude pseudo-satellite that can change how we use the sky. At a fraction of the cost of a satellite and powered by the sun, Odysseus offers vast new possibilities for those who need to stay connected and informed.
[ Aurora ]
This video from 1999 discusses the soccer robot research activities at Carnegie Mellon University. CMUnited, the team of robots developed by Manuela Veloso and her students, won the small-size competition in both 1997 and 1998.
[ CMU ]
Thanks Fan!
This video presents an overview of our participation in the DARPA Subterranean Challenge, with a focus on the urban edition, which took place Feb. 18-27, 2020, at Satsop Business Park west of Olympia, Washington.
[ Norlab ]
In today’s most advanced warehouses, Magazino’s autonomous robot TORU works side by side with human colleagues. The robot is specialized in picking, transporting, and stowing objects like shoe boxes in e-commerce warehouses.
[ Magazino ]
A look at the Control Systems Lab at the National Technical University of Athens.
[ CSL ]
Thanks Fan!
Doug Weber of MechE and the Neuroscience Institute discusses his group’s research on harnessing the nervous system's ability to control not only our bodies, but the machines and prostheses that can enhance our bodies, especially for those with disabilities.
[ CMU ]
Mark Yim, Director of the GRASP Lab at UPenn, gives a talk on “Is Cost Effective Robotics Interesting?” Yes, yes it is.
Robotic technologies have shown the capability to do amazing things. But many of those things are too expensive to be useful in any real sense. Cost reduction has often been shunned by research engineers and scientists in academia as “just engineering.” For robotics to make a larger impact on society, the cost problem must be addressed.
[ CMU ]
There are all kinds of “killer robots” debates going on, but if you want an informed, grounded, nuanced take on AI and the future of war-fighting, you want to be watching debates like these instead. Professor Rebecca Crootof speaks with Brigadier General Patrick Huston, Assistant Judge Advocate General for Military Law and Operations, at Duke Law School's 26th Annual National Security Law conference.
[ Lawfire ]
This week’s Lockheed Martin Robotics Seminar is by Julie Adams from Oregon State, on “Human-Collective Teams: Algorithms, Transparency, and Resilience.”
Biological inspiration for artificial systems abounds. The science to support robotic collectives continues to emerge from their biological inspirations: spatial swarms (e.g., fish and starlings) and colonies (e.g., honeybees and ants). Developing effective human-collective teams requires focusing on all aspects of integrated system development. Many of these fundamental aspects have been developed independently, but our focus is an integrated development process for these complex research questions. This presentation will focus on three aspects: algorithms, transparency, and resilience for collectives.
[ UMD ]
#438745 Social robot from India
The Indian humanoid “SHALU” can speak 9 Indian and 38 foreign languages, recognize faces, and identify people and objects!
#439127 Cobots Act Like Puppies to Better ...
Human-robot interaction goes both ways. You’ve got robots understanding (or attempting to understand) humans, as well as humans understanding (or attempting to understand) robots. Humans, in my experience, are virtually impossible to understand even under the best of circumstances. But going the other way, robots have all kinds of communication tools at their disposal. Lights, sounds, screens, haptics—there are lots of options. That doesn’t mean that robot to human (RtH) communication is easy, though, because the ideal communication modality is something that is low cost and low complexity while also being understandable to almost anyone.
One good option for something like a collaborative robot arm can be to use human-inspired gestures (since it doesn’t require any additional hardware), although it’s important to be careful when you start having robots doing human stuff, because it can set unreasonable expectations if people think of the robot in human terms. In order to get around this, roboticists from Aachen University are experimenting with animal-like gestures for cobots instead, modeled after the behavior of puppies. Puppies!
For robots that are low-cost and appearance-constrained, animal-inspired (zoomorphic) gestures can be highly effective at state communication. We know this because of tails on Roombas:
While this is an adorable experiment, adding tails to industrial cobots is probably not going to happen. That’s too bad, because humans have an intuitive understanding of dog gestures, and this extends even to people who aren’t dog owners. But tails aren’t necessary for something to display dog gestures; it turns out that you can do it with a standard robot arm:
In a recent preprint in IEEE Robotics and Automation Letters (RA-L), first author Vanessa Sauer used puppies to inspire a series of communicative gestures for a Franka Emika Panda arm. Specifically, the arm was to be used in a collaborative assembly task, and needed to communicate five states to the human user, including greeting the user, prompting the user to take a part, waiting for a new command, an error condition when a container was empty of parts, and then shutting down. From the paper:
For each use case, we mirrored the intention of the robot (e.g., prompting the user to take a part) to an intention a dog may have (e.g., encouraging the owner to play). In a second step, we collected gestures that dogs use to express the respective intention by leveraging real-life interaction with dogs, online videos, and literature. We then translated the dog gestures into three distinct zoomorphic gestures by jointly applying the following guidelines inspired by:
Mimicry. We mimic specific dog behavior and body language to communicate robot states.
Exploiting structural similarities. Although the cobot is functionally designed, we exploit certain components to make the gestures more “dog-like,” e.g., the camera corresponds to the dog’s eyes, or the end-effector corresponds to the dog’s snout.
Natural flow. We use kinesthetic teaching and record a full trajectory to allow natural and flowing movements with increased animacy.
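The paper describes this pipeline at a high level rather than as code, but the core idea of mapping discrete robot states to pre-recorded gesture trajectories is easy to sketch. Below is a minimal, hypothetical Python illustration (not the authors' implementation, and not the Franka API): the state names follow the five use cases above, while the joint-space waypoints and the send_joint_command callback are placeholder assumptions standing in for whatever motion interface a real cobot would provide.

import time
from typing import Callable, Dict, List, Sequence

# Joint-space waypoints (7 joints for a Panda-style arm), as might be recorded
# via kinesthetic teaching. The values here are placeholders, not real recordings.
GESTURES: Dict[str, List[Sequence[float]]] = {
    "greeting": [
        [0.0, -0.4, 0.0, -1.8, 0.0, 1.6, 0.8],
        [0.3, -0.2, 0.0, -1.6, 0.0, 1.8, 0.8],   # tilt the "snout" up toward the user
    ],
    "prompt_take_part": [
        [0.0, 0.3, 0.0, -1.2, 0.0, 1.5, 0.8],
        [0.0, 0.5, 0.0, -1.0, 0.0, 1.5, 0.8],    # nudge toward the parts container
    ],
    "waiting": [
        [0.0, -0.6, 0.0, -2.0, 0.0, 1.4, 0.8],   # settle into a rest pose
    ],
    "error_container_empty": [
        [-0.4, -0.4, 0.0, -1.8, 0.0, 1.6, 0.8],
        [0.4, -0.4, 0.0, -1.8, 0.0, 1.6, 0.8],   # side-to-side "head shake"
    ],
    "shutdown": [
        [0.0, 0.8, 0.0, -0.8, 0.0, 0.8, 0.8],    # slump downward
    ],
}

def play_gesture(state: str,
                 send_joint_command: Callable[[Sequence[float]], None],
                 pause_s: float = 0.5) -> None:
    """Replay the waypoint sequence associated with a robot state."""
    for waypoint in GESTURES[state]:
        send_joint_command(waypoint)  # hand off to the robot's motion interface
        time.sleep(pause_s)           # crude pacing; a real system would blend trajectories

if __name__ == "__main__":
    # Stand-in for a real robot interface: just print the commanded waypoints.
    play_gesture("error_container_empty", lambda q: print("move to", q))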
A user study comparing the zoomorphic gestures to a more conventional light display for state communication during the assembly task showed that the zoomorphic gestures were easily recognized by participants as dog-like, even if the participants weren’t dog people. And the zoomorphic gestures were also more intuitively understood than the light displays, although the classification of each gesture wasn’t perfect. People also preferred the zoomorphic gestures over more abstract gestures designed to communicate the same concept. Or as the paper puts it, “Zoomorphic gestures are significantly more attractive and intuitive and provide more joy when using.” An online version of the study is here, so give it a try and provide yourself with some joy.
While zoomorphic gestures (at least in this very preliminary research) aren’t nearly as accurate at state communication as using something like a screen, they’re appealing because they’re compelling, easy to understand, inexpensive to implement, and less restrictive than sounds or screens. And there’s no reason why you can’t use both!
For a few more details, we spoke with the first author on this paper, Vanessa Sauer.
IEEE Spectrum: Where did you get the idea for this research from, and why do you think it hasn't been more widely studied or applied in the context of practical cobots?
Vanessa Sauer: I'm a total dog person. During a conversation about dogs and how their ways of communicating with their owners have evolved over time (e.g., more expressive faces, easy to understand even without owning a dog), I got the rough idea for my research. I was curious to see if this intuitive understanding many people have of dog behavior could also be applied to cobots that communicate in a similar way. Especially in social robotics, approaches utilizing zoomorphic gestures have been explored. I guess due to their playful nature, less research and fewer applications have been done in the context of industrial robots, as they often have a stronger focus on efficiency.
How complex of a concept can be communicated in this way?
In our “proof-of-concept” style approach, we used rather basic robot states to be communicated. The challenge with more complex robot states would be to find intuitive parallels in dog behavior. Nonetheless, I believe that more complex states can also be communicated with dog-inspired gestures.
How would you like to see your research be put into practice?
I would enjoy seeing zoomorphic gestures offered as a modality option on cobots, especially cobots used in industry. I think that could have the potential to reduce inhibitions towards collaborating with robots and make the interaction more fun.
Photos, Robots: Franka Emika; Dogs: iStockphoto
Zoomorphic Gestures for Communicating Cobot States, by Vanessa Sauer, Axel Sauer, and Alexander Mertens from Aachen University and TUM, will be published in RA-L.
#439125 Baubot comes out with two new robots to ...
Despite artificial intelligence and robotics making inroads into many other areas of life and the workforce, construction has long remained dominated by humans in neon caps and vests. Now, the robotics company Baubot has developed the Printstones robot, which it hopes will supplement human construction workers onsite.