Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#439136 Video Friday: Aquatic Snakebotics

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRA 2021 – May 30-June 5, 2021 – [Online Event]
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Researchers from the Biorobotics Lab in the School of Computer Science’s Robotics Institute at Carnegie Mellon University tested the hardened underwater modular robot snake (HUMRS) last month in the pool, diving the robot through underwater hoops, showing off its precise and smooth swimming, and demonstrating its ease of control.

The robot's modular design allows it to adapt to different tasks, whether squeezing through tight spaces under rubble, climbing up a tree or slithering around a corner underwater. For the underwater robot snake, the team used existing watertight modules that allow the robot to operate in bad conditions. They then added new modules containing the turbines and thrusters needed to maneuver the robot underwater.

[ CMU ]

Robots are learning how not to fall over after stepping on your foot and kicking you in the shin.

[ B-Human ]

Like boot prints on the Moon, NASA's OSIRIS-REx spacecraft left its mark on asteroid Bennu. Now, new images—taken during the spacecraft's final fly-over on April 7, 2021—reveal the aftermath of the historic Touch-and-Go (TAG) sample acquisition event from Oct. 20, 2020.

[ NASA ]

In recognition of National Robotics Week, Conan O'Brien thanks one of the robots that works for him.

[ YouTube ]

The latest from Wandercraft's self-balancing Atalante exo.

[ Wandercraft ]

Stocking supermarket shelves is one of those things that's much more difficult than it looks for robots, involving in-hand manipulation, motion planning, vision, and tactile sensing. Easy for humans, but robots are getting better.

[ Article ]

Thanks Marco!

Draganfly drone spraying Varigard disinfectant at the Smoothie King stadium. "Our drone sanitization spraying technology is up to 100% more efficient and effective than conventional manual spray sterilization processes."

[ Draganfly ]

Baubot is a mobile construction robot that can do pretty much everything, apparently.

I’m pretty skeptical of robots like these, especially ones that bill themselves as platforms that can be monetized by third-party developers. From what we've seen, the most successful robots instead focus on doing one thing very well.

[ Baubot ]

In this demo, a remote operator sends an unmanned ground vehicle on an autonomous inspection mission via Clearpath’s web-based Outdoor Navigation Software.

[ Clearpath ]

Aurora’s Odysseus aircraft is a high-altitude pseudo-satellite that can change how we use the sky. At a fraction of the cost of a satellite and powered by the sun, Odysseus offers vast new possibilities for those who need to stay connected and informed.

[ Aurora ]

This video from 1999 discusses the soccer robot research activities at Carnegie Mellon University. CMUnited, the team of robots developed by Manuela Veloso and her students, won the small-size competition in both 1997 and 1998.

[ CMU ]

Thanks Fan!

This video presents an overview of our participation in the DARPA Subterranean Challenge, with a focus on the urban edition that took place Feb. 18-27, 2020, at Satsop Business Park west of Olympia, Washington.

[ Norlab ]

In today’s most advanced warehouses, Magazino’s autonomous robot TORU works side by side with human colleagues. The robot is specialized in picking, transporting, and stowing objects like shoe boxes in e-commerce warehouses.

[ Magazino ]

A look at the Control Systems Lab at the National Technical University of Athens.

[ CSL ]

Thanks Fan!

Doug Weber of MechE and the Neuroscience Institute discusses his group’s research on harnessing the nervous system's ability to control not only our bodies, but the machines and prostheses that can enhance our bodies, especially for those with disabilities.

[ CMU ]

Mark Yim, Director of the GRASP Lab at UPenn, gives a talk on “Is Cost Effective Robotics Interesting?” Yes, yes it is.

Robotic technologies have shown the capability to do amazing things. But many of those things are too expensive to be useful in any real sense. Cost reduction has often been shunned by research engineers and scientists in academia as “just engineering.” For robotics to make a larger impact on society the cost problem must be addressed.

[ CMU ]

There are all kinds of “killer robots” debates going on, but if you want an informed, grounded, nuanced take on AI and the future of war-fighting, you want to be watching debates like these instead. Professor Rebecca Crootof speaks with Brigadier General Patrick Huston, Assistant Judge Advocate General for Military Law and Operations, at Duke Law School's 26th Annual National Security Law conference.

[ Lawfire ]

This week’s Lockheed Martin Robotics Seminar is by Julie Adams from Oregon State, on “Human-Collective Teams: Algorithms, Transparency, and Resilience.”

Biological inspiration for artificial systems abounds. The science to support robotic collectives continues to emerge from their biological inspirations: spatial swarms (e.g., fish and starlings) and colonies (e.g., honeybees and ants). Developing effective human-collective teams requires focusing on all aspects of the integrated system development. Many of these fundamental aspects have been developed independently, but our focus is an integrated development process for these complex research questions. This presentation will focus on three aspects: algorithms, transparency, and resilience for collectives.

[ UMD ]

Posted in Human Robots

#439132 This Week’s Awesome Tech Stories From ...

ARTIFICIAL INTELLIGENCE
15 Graphs You Need to See to Understand AI in 2021
Charles Q. Choi | IEEE Spectrum
“If you haven’t had time to read the AI Index Report for 2021, which clocks in at 222 pages, don’t worry—we’ve got you covered. The massive document, produced by the Stanford Institute for Human-Centered Artificial Intelligence, is packed full of data and graphs, and we’ve plucked out 15 that provide a snapshot of the current state of AI.”

FUTURE
Geoffrey Hinton Has a Hunch About What’s Next for Artificial Intelligence
Siobhan Roberts | MIT Technology Review
“Back in November, the computer scientist and cognitive psychologist Geoffrey Hinton had a hunch. After a half-century’s worth of attempts—some wildly successful—he’d arrived at another promising insight into how the brain works and how to replicate its circuitry in a computer.”

ROBOTICS
Robotic Exoskeletons Could One Day Walk by Themselves
Charles Q. Choi | IEEE Spectrum
“Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system’s analysis of a user’s current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says.”

TECHNOLOGY
Microsoft Buys AI Speech Tech Company Nuance for $19.7 Billion
James Vincent | The Verge
“The $19.7 billion acquisition of Nuance is Microsoft’s second-largest behind its purchase of LinkedIn in 2016 for $26 billion. It comes at a time when speech tech is improving rapidly, thanks to the deep learning boom in AI, and there are simultaneously more opportunities for its use.”

ENVIRONMENT
Google’s New 3D Time-Lapse Feature Shows How Humans Are Affecting the Planet
Sam Rutherford | Gizmodo
“Described by Google Earth director Rebecca Moore as the biggest update to Google Earth since 2017, Timelapse in Google Earth combines more than 24 million satellite photos, two petabytes of data, and 2 million hours of CPU processing time to create a 4.4-terapixel interactive view showing how the Earth has changed from 1984 to 2020.”

GENETICS
The Genetic Mistakes That Could Shape Our Species
Zaria Gorvett | BBC
“New technologies may have already introduced genetic errors to the human gene pool. How long will they last? And how could they affect us? …According to [Stanford’s Hank] Greely, who has written a book about the implications of He [Jiankui]’s project, the answer depends on what the edits do and how they’re inherited.”

SPACE
The Era of Reusability in Space Has Begun
Eric Berger | Ars Technica
“As [Earth orbit] becomes more cluttered [due to falling launch costs], the responsible thing is to more actively refuel, recycle, and dispose of satellites. Northrop Grumman has made meaningful progress toward such a future of satellite servicing. As a result, reusability is now moving into space.”

COMPUTING
100 Million More IoT Devices Are Exposed—and They Won’t Be the Last
Lily Hay Newman | Wired
“Over the last few years, researchers have found a shocking number of vulnerabilities in seemingly basic code that underpins how devices communicate with the internet. Now a new set of nine such vulnerabilities are exposing an estimated 100 million devices worldwide, including an array of internet-of-things products and IT management servers.”

Image Credit: Naitian (Tony) Wang / Unsplash


#438745 Social robot from India

The Indian humanoid “SHALU” is able to speak 9 Indian and 38 foreign languages, can recognize faces, and can identify people and objects!


#439127 Cobots Act Like Puppies to Better ...

Human-robot interaction goes both ways. You’ve got robots understanding (or attempting to understand) humans, as well as humans understanding (or attempting to understand) robots. Humans, in my experience, are virtually impossible to understand even under the best of circumstances. But going the other way, robots have all kinds of communication tools at their disposal. Lights, sounds, screens, haptics—there are lots of options. That doesn’t mean that robot-to-human (RtH) communication is easy, though, because the ideal communication modality is something that is low cost and low complexity while also being understandable to almost anyone.

One good option for something like a collaborative robot arm can be to use human-inspired gestures (since it doesn’t require any additional hardware), although it’s important to be careful when you start having robots doing human stuff, because it can set unreasonable expectations if people think of the robot in human terms. In order to get around this, roboticists from Aachen University are experimenting with animal-like gestures for cobots instead, modeled after the behavior of puppies. Puppies!

For robots that are low-cost and appearance-constrained, animal-inspired (zoomorphic) gestures can be highly effective at state communication. We know this because of tails on Roombas:

While this is an adorable experiment, adding tails to industrial cobots is probably not going to happen. That’s too bad, because humans have an intuitive understanding of dog gestures, and this extends even to people who aren’t dog owners. But tails aren’t necessary for something to display dog gestures; it turns out that you can do it with a standard robot arm:

In a recent preprint in IEEE Robotics and Automation Letters (RA-L), first author Vanessa Sauer used puppies to inspire a series of communicative gestures for a Franka Emika Panda arm. Specifically, the arm was to be used in a collaborative assembly task, and needed to communicate five states to the human user, including greeting the user, prompting the user to take a part, waiting for a new command, an error condition when a container was empty of parts, and then shutting down. From the paper:

For each use case, we mirrored the intention of the robot (e.g., prompting the user to take a part) to an intention a dog may have (e.g., encouraging the owner to play). In a second step, we collected gestures that dogs use to express the respective intention by leveraging real-life interaction with dogs, online videos, and literature. We then translated the dog gestures into three distinct zoomorphic gestures by jointly applying the following guidelines:

Mimicry. We mimic specific dog behavior and body language to communicate robot states.
Exploiting structural similarities. Although the cobot is functionally designed, we exploit certain components to make the gestures more “dog-like,” e.g., the camera corresponds to the dog’s eyes, or the end-effector corresponds to the dog’s snout.
Natural flow. We use kinesthetic teaching and record a full trajectory to allow natural and flowing movements with increased animacy.
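The resulting design boils down to a lookup from robot state to gesture. As a rough sketch of that mapping, here are the five states named in the article paired with illustrative gestures; the gesture descriptions and function names are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of the state-to-gesture mapping described above.
# The five states come from the article; the gesture descriptions are
# illustrative guesses, not the authors' actual trajectories.
ZOOMORPHIC_GESTURES = {
    "greeting": "perk up and tilt the camera ('eyes') toward the user",
    "prompt_take_part": "nudge the end-effector ('snout') toward the part",
    "waiting": "slow idle sway while the camera follows the user",
    "error_empty_container": "droop toward the empty container and pause",
    "shutting_down": "lower the arm and curl in, like a dog settling down",
}

def gesture_for(state: str) -> str:
    """Return the gesture for a robot state, falling back to 'waiting'."""
    return ZOOMORPHIC_GESTURES.get(state, ZOOMORPHIC_GESTURES["waiting"])
```

In the actual study, each entry would hold a full joint-space trajectory recorded via kinesthetic teaching rather than a text description.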

A user study comparing the zoomorphic gestures to a more conventional light display for state communication during the assembly task showed that the zoomorphic gestures were easily recognized by participants as dog-like, even if the participants weren’t dog people. And the zoomorphic gestures were also more intuitively understood than the light displays, although the classification of each gesture wasn’t perfect. People also preferred the zoomorphic gestures over more abstract gestures designed to communicate the same concept. Or as the paper puts it, “Zoomorphic gestures are significantly more attractive and intuitive and provide more joy when using.” An online version of the study is here, so give it a try and provide yourself with some joy.

While zoomorphic gestures (at least in this very preliminary research) aren’t nearly as accurate at state communication as using something like a screen, they’re appealing because they’re compelling, easy to understand, inexpensive to implement, and less restrictive than sounds or screens. And there’s no reason why you can’t use both!

For a few more details, we spoke with the first author on this paper, Vanessa Sauer.

IEEE Spectrum: Where did you get the idea for this research from, and why do you think it hasn't been more widely studied or applied in the context of practical cobots?

Vanessa Sauer: I'm a total dog person. During a conversation about dogs and how their ways of communicating with their owners have evolved over time (e.g., more expressive faces, easy to understand even without owning a dog), I got the rough idea for my research. I was curious to see if this intuitive understanding many people have of dog behavior could also be applied to cobots that communicate in a similar way. Especially in social robotics, approaches utilizing zoomorphic gestures have been explored. I guess due to the playful nature, less research and application has been done in the context of industrial robots, as they often have a stronger focus on efficiency.

How complex of a concept can be communicated in this way?

In our “proof-of-concept” style approach, we used rather basic robot states to be communicated. The challenge with more complex robot states would be to find intuitive parallels in dog behavior. Nonetheless, I believe that more complex states can also be communicated with dog-inspired gestures.

How would you like to see your research be put into practice?

I would enjoy seeing zoomorphic gestures offered as modality-option on cobots, especially cobots used in industry. I think that could have the potential to reduce inhibitions towards collaborating with robots and make the interaction more fun.

Photos, Robots: Franka Emika; Dogs: iStockphoto

Zoomorphic Gestures for Communicating Cobot States, by Vanessa Sauer, Axel Sauer, and Alexander Mertens from Aachen University and TUM, will be published in RA-L.


#439125 Baubot comes out with two new robots to ...

Despite artificial intelligence and robotics spreading into many other areas of life and the workforce, construction has long remained dominated by humans in neon caps and vests. Now, the robotics company Baubot has developed the Printstones robot, which they hope will supplement human construction workers onsite.
