Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#439030 How tiny machines become capable of ...

Living organisms, from bacteria to animals and humans, can perceive their environment and process, store, and retrieve this information, learning to react to later situations with appropriate actions. A team of physicists at Leipzig University led by Professor Frank Cichos, in collaboration with colleagues at Charles University in Prague, has developed a method that gives tiny artificial microswimmers a limited ability to learn using machine learning algorithms. They recently published a paper on this topic in the journal Science Robotics.


#439023 In ‘Klara and the Sun,’ We Glimpse ...

In a store in the center of an unnamed city, humanoid robots are displayed alongside housewares and magazines. They watch the fast-moving world outside the window, anxiously awaiting the arrival of customers who might buy them and take them home. Among them is Klara, a particularly astute robot who loves the sun and wants to learn as much as possible about humans and the world they live in.

So begins Kazuo Ishiguro’s new novel Klara and the Sun, published earlier this month. The book, told from Klara’s perspective, portrays an eerie future society in which intelligent machines and other advanced technologies have been integrated into daily life, but not everyone is happy about it.

Technological unemployment, the progress of artificial intelligence, inequality, the safety and ethics of gene editing, increasing loneliness and isolation—all of which we’re grappling with today—show up in Ishiguro’s world. It’s like he hit a fast-forward button, mirroring back to us how things might play out if we don’t approach these technologies with caution and foresight.

The wealthy genetically edit or “lift” their children to set them up for success, while the poor have to make do with the regular old brains and bodies bequeathed them by evolution. Lifted and unlifted kids generally don’t mix, and this is just one of many sinister delineations between a new breed of haves and have-nots.

There’s anger about robots’ steady infiltration into everyday life, and questions about how similar their rights should be to those of humans. “First they take the jobs. Then they take the seats at the theater?” one woman fumes.

References to “changes” and “substitutions” allude to an economy where automation has eliminated millions of jobs. While “post-employed” people squat in abandoned buildings and fringe communities arm themselves in preparation for conflict, those whose livelihoods haven’t been destroyed can afford to have live-in housekeepers and buy Artificial Friends (or AFs) for their lonely children.

“The old traditional model that we still live with now—where most of us can get some kind of paid work in exchange for our services or the goods we make—has broken down,” Ishiguro said in a podcast discussion of the novel. “We’re not talking just about the difference between rich and poor getting bigger. We’re talking about a gap appearing between people who participate in society in an obvious way and people who do not.”

He has a point; as much as techno-optimists claim that the economic changes brought by automation and AI will give us all more free time, let us work less, and devote time to our passion projects, how would that actually play out? What would millions of “post-employed” people receiving basic income actually do with their time and energy?

In the novel, we don’t get much of a glimpse of this side of the equation, but we do see how the wealthy live. After a long wait, just as the store manager seems ready to give up on selling her, Klara is chosen by a 14-year-old girl named Josie, the daughter of a woman who wears “high-rank clothes” and lives in a large, sunny home outside the city. Cheerful and kind, Josie suffers from an unspecified illness that periodically flares up and leaves her confined to her bed for days at a time.

Her life seems somewhat bleak, the need for an AF clear. In this future world, the children of the wealthy no longer go to school together, instead studying alone at home on their digital devices. “Interaction meetings” are set up for them to learn to socialize, their parents carefully eavesdropping from the next room and trying not to intervene when there’s conflict or hurt feelings.

Klara does her best to be a friend, aide, and confidante to Josie while continuing to learn about the world around her and decode the mysteries of human behavior. We surmise that she was programmed with a basic ability to understand emotions, which evolves along with her other types of intelligence. “I believe I have many feelings. The more I observe, the more feelings become available to me,” she explains to one character.

Ishiguro does an excellent job of representing Klara’s mind: a blend of pre-determined programming, observation, and continuous learning. Her narration has qualities both robotic and human; we can tell when something has been programmed in—she “Gives Privacy” to the humans around her when that’s appropriate, for example—and when she’s figured something out for herself.

But the author maintains some mystery around Klara’s inner emotional life. “Does she actually understand human emotions, or is she just observing human emotions and simulating them within herself?” he said. “I suppose the question comes back to, what are our emotions as human beings? What do they amount to?”

Klara is particularly attuned to human loneliness, since she was essentially made to help prevent it. It is, in her view, people's biggest fear, and something they'll go to great lengths to avoid, yet can never fully escape. "Perhaps all humans are lonely," she says.

Warding off loneliness through technology isn't a futuristic idea; it's something we've been doing for a long time, with the technologies at hand growing more and more sophisticated. Products like AFs already exist. There's XiaoIce, a chatbot that uses "sentiment analysis" to keep its 660 million users engaged, and Azuma Hikari, a character-based AI designed to "bring comfort" to users whose lives lack emotional connection with other humans.

The mere existence of these tools might seem sinister if it weren't for their widespread adoption: when millions of people use AIs to fill a void in their lives, it raises deeper questions about our ability to connect with each other, and about whether technology is building that ability up or tearing it down.

This isn’t the only big question the novel tackles. An overarching theme is one we’ve been increasingly contemplating as computers start to acquire more complex capabilities, like the beginnings of creativity or emotional awareness: What is it that truly makes us human?

“Do you believe in the human heart?” one character asks. “I don’t mean simply the organ, obviously. I’m speaking in the poetic sense. The human heart. Do you think there is such a thing? Something that makes each of us special and individual?”

The alternative, at least in the story, is that people don’t have a unique essence, but rather we’re all a blend of traits and personalities that can be reduced to strings of code. Our understanding of the brain is still elementary, but at some level, doesn’t all human experience boil down to the firing of billions of neurons between our ears? Will we one day—in a future beyond that painted by Ishiguro, but certainly foreshadowed by it—be able to “decode” our humanity to the point that there’s nothing mysterious left about it? “A human heart is bound to be complex,” Klara says. “But it must be limited.”

Whether or not you agree, Klara and the Sun is worth the read. It’s both a marvelous, engaging story about what it means to love and be human, and a prescient warning to approach technological change with caution and nuance. We’re already living in a world where AI keeps us company, influences our behavior, and is wreaking various forms of havoc. Ishiguro’s novel is a snapshot of one of our possible futures, told through the eyes of a robot who keeps you rooting for her to the end.

Image Credit: Marion Wellmann from Pixabay


#437423 Robonaut2 joins ISS (2011)

Space Shuttle Discovery carried the humanoid robot Robonaut2 (also known as R2) to the International Space Station (ISS) as part of STS-133. Robonaut2 originally consisted only of a torso, made out of nickel-plated carbon fiber and aluminum. A pair of …


#439012 Video Friday: Man-Machine Synergy ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Man-Machine Synergy Effectors, Inc. is a Japanese company working on an absolutely massive “human machine synergistic effect device,” which is a huge robot controlled by a nearby human using a haptic rig.

From the look of things, the next generation will be able to move around. Whoa.

[ MMSE ]

This method of loading and unloading AMRs without having them ever stop moving is so obvious that there must be some equally obvious reason why I've never seen it done in practice.

The LoadRunner is able to transport and sort parcels weighing up to 30 kilograms. This makes it the perfect luggage carrier for airports. These AI-driven go-carts can also work in concert as larger collectives to carry large, heavy and bulky objects. Every LoadRunner can also haul up to four passive trailers. Powered by four electric motors, the LoadRunner sharply brakes at just the right moment right in front of its destination and the payload slides from the robot onto the delivery platform.

[ Fraunhofer ] via [ Gizmodo ]

Ayato Kanada at Kyushu University wrote in to share this clever “dislocatable joint,” a way of combining continuum and rigid robots.

[ Paper ]

Thanks Ayato!

The DodgeDrone challenge revisits the popular dodgeball game in the context of autonomous drones. Specifically, participants will have to code navigation policies to fly drones between waypoints while avoiding dynamic obstacles. Drones are fast but fragile systems: as soon as something hits them, they will crash! Since objects will move towards the drone with different speeds and acceleration, smart algorithms are required to avoid them!

This could totally happen in real life, and we need to be prepared for it!

[ DodgeDrone Challenge ]
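
To make the idea of a "navigation policy" concrete, here's a minimal, purely illustrative Python sketch of a reactive dodge-and-go controller. This is not the challenge's actual API or simulator interface; every function and parameter name below is hypothetical, and a competitive entry would more likely use learned or optimization-based planning than this simple potential-field heuristic.

import numpy as np

def dodge_policy(drone_pos, waypoint, obstacles, v_max=3.0, safe_radius=1.5):
    """Toy reactive policy: fly toward the next waypoint, adding a repulsive
    velocity away from any obstacle predicted to pass within safe_radius.
    All names and parameters are illustrative, not part of the challenge."""
    # Attractive term: unit vector toward the waypoint
    to_goal = waypoint - drone_pos
    cmd = to_goal / (np.linalg.norm(to_goal) + 1e-6)

    # Repulsive terms: push away from obstacles that get too close
    for obs_pos, obs_vel in obstacles:
        predicted = obs_pos + 0.5 * obs_vel          # short-horizon prediction
        offset = drone_pos - predicted
        dist = np.linalg.norm(offset)
        if dist < safe_radius:
            cmd += 2.0 * (safe_radius - dist) / safe_radius * offset / (dist + 1e-6)

    # Scale the commanded velocity to the drone's maximum speed
    return v_max * cmd / (np.linalg.norm(cmd) + 1e-6)

# Example: one obstacle approaching from the side
# v_cmd = dodge_policy(np.zeros(3), np.array([5., 0., 1.]),
#                      [(np.array([1., 2., 1.]), np.array([0., -2., 0.]))])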

In addition to winning the Best Student Design Competition CREATIVITY Award at HRI 2021, this paper would also have won the Best Paper Title award, if that award existed.

[ Paper ]

Robots are traditionally bound by a fixed morphology during their operational lifetime, which is limited to adapting only their control strategies. Here we present the first quadrupedal robot that can morphologically adapt to different environmental conditions in outdoor, unstructured environments.

We show that the robot exploits its training to effectively transition between different morphological configurations, exhibiting substantial performance improvements over a non-adaptive approach. The demonstrated benefits of real-world morphological adaptation demonstrate the potential for a new embodied way of incorporating adaptation into future robotic designs.

[ Nature ]

A drone video shot in a Minneapolis bowling alley was hailed as an instant classic. One Hollywood veteran said it “adds to the language and vocabulary of cinema.” One IEEE Spectrum editor said “hey that's pretty cool.”

[ Bryant Lake Bowl ]

It doesn't take a robot to convince me to buy candy, but I think if I buy candy from Relay it's a business expense, right?

[ RIS ]

DARPA is making progress on its AI dogfighting program, with physical flight tests expected this year.

[ DARPA ACE ]

Unitree Robotics has realized that the Empire needs to be overthrown!

[ Unitree ]

Windhover Labs, an emerging leader in open and reliable flight software and hardware, announces the upcoming availability of its first hardware product, a low cost modular flight computer for commercial drones and small satellites.

[ Windhover ]

As robots and autonomous systems are poised to become part of our everyday lives, the University of Michigan and Ford are opening a one-of-a-kind facility where they’ll develop robots and roboticists that help make lives better, keep people safer and build a more equitable society.

[ U Michigan ]

The adaptive robot Rizon, combined with a new hybrid electrostatic and gecko-inspired gripping pad developed by Stanford BDML, can manipulate bulky, non-smooth items with minimal effort, broadening its applications in retail and household environments.

[ Flexiv ]

Thanks Yunfan!

I don't know why anyone would want things to get MORE icy, but if you do for some reason, you can make it happen with a Husky.

Is winter over yet?

[ Clearpath ]

Skip ahead to about 1:20 to see a pair of Gita robots following a Spot following a human like a chain of lil’ robot ducklings.

[ PFF ]

Here are a couple of retro robotics videos: one showing teleoperated humanoids from 2000, and the other showing a robotic guide dog from 1976 (!).

[ Tachi Lab ]

Thanks Fan!

If you missed Chad Jenkins' talk “That Ain’t Right: AI Mistakes and Black Lives” last time, here's another opportunity to watch from Robotics Today, and it includes a top notch panel discussion at the end.

[ Robotics Today ]

Since its founding in 1979, the Robotics Institute (RI) at Carnegie Mellon University has been leading the world in robotics research and education. In the mid 1990s, RI created NREC as the applied R&D center within the Institute with a specific mission to apply robotics technology in an impactful way on real-world applications. In this talk, I will go over numerous R&D programs that I have led at NREC in the past 25 years.

[ CMU ]


#439010 Video Friday: Nanotube-Powered Insect ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
Let us know if you have suggestions for next week, and enjoy today's videos.

If you’ve ever swatted a mosquito away from your face, only to have it return again (and again and again), you know that insects can be remarkably acrobatic and resilient in flight. Those traits help them navigate the aerial world, with all of its wind gusts, obstacles, and general uncertainty. Such traits are also hard to build into flying robots, but MIT Assistant Professor Kevin Yufeng Chen has built a system that approaches insects’ agility.

Chen’s actuators can flap nearly 500 times per second, giving the drone insect-like resilience. “You can hit it when it’s flying, and it can recover,” says Chen. “It can also do aggressive maneuvers like somersaults in the air.” And it weighs in at just 0.6 grams, approximately the mass of a large bumblebee. The drone looks a bit like a tiny cassette tape with wings, though Chen is working on a new prototype shaped like a dragonfly.

[ MIT ]

National Robotics Week is April 3-11, 2021!

[ NRW ]

This is in a motion capture environment, but still, super impressive!

[ Paper ]

Thanks Fan!

Why wait for Boston Dynamics to add an arm to your Spot if you can just do it yourself?

[ ETHZ ]

This video shows the deep-sea free swimming of a soft robot in the South China Sea. The soft robot was grasped by a robotic arm on the ‘HAIMA’ ROV and carried to the bottom of the South China Sea (depth of 3,224 m). After release, the soft robot was actuated with an on-board AC voltage of 8 kV at 1 Hz and demonstrated free-swimming locomotion with its flapping fins.

Um, did they bring it back?

[ Nature ]

Quadruped Yuki Mini is a 12-DOF robot equipped with a Raspberry Pi that runs ROS. Also, BUNNIES!

[ Lingkang Zhang ]

Thanks Lingkang!

Deployment of drone swarms usually relies on inter-agent communication or visual markers that are mounted on the vehicles to simplify their mutual detection. The vswarm package enables decentralized vision-based control of drone swarms without relying on inter-agent communication or visual fiducial markers. The results show that the drones can safely navigate in an outdoor environment despite substantial background clutter and difficult lighting conditions.

[ Vswarm ]

A conventionally adopted method for operating a waiter robot is based on static position control, where pre-defined goal positions are marked on a map. However, this solution is not optimal in a dynamic setting, such as a coffee shop or an outdoor catering event, because the customers often change their positions. We explore an alternative human-robot interface design where a human operator instead communicates the identity of the customer to the robot. Inspired by how humans communicate, we propose a framework for communicating a visual goal to the robot through interactive two-way communication.

[ Paper ]

Thanks Poramate!

In this video, LOLA reacts to undetected ground height changes, including a drop and leg-in-hole experiment. Further tests show the robustness to vertical disturbances using a seesaw. The robot is technically blind, not using any camera-based or prior information on the terrain.

[ TUM ]

RaiSim is a cross-platform multi-body physics engine for robotics and AI. It fully supports Linux, Mac OS, and Windows.

[ RaiSim ]

Thanks Fan!

The next generation of LoCoBot is here. The LoCoBot is a ROS research rover for mapping, navigation, and (optional) manipulation that enables researchers, educators, and students alike to focus on high-level code development instead of hardware and building out lower-level code. Development on the LoCoBot is simplified with open-source software, full ROS mapping and navigation packages, and a modular open-source Python API that allows users to move the platform, as well as the (optional) manipulator, in as few as 10 lines of code.

[ Trossen ]
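
For a sense of what that "10 lines of code" claim looks like in practice, here's a rough sketch based on the PyRobot-style Python API historically used with LoCoBot. The exact package, class, and method names for this new Trossen generation may differ, so treat this as an assumption-laden illustration rather than the product's documented interface.

# Sketch assuming the PyRobot interface historically used with LoCoBot;
# the new Trossen generation may expose a different Python package.
from pyrobot import Robot

# Bring up base, arm, gripper, and camera interfaces
robot = Robot('locobot')

# Drive 0.5 m forward and rotate ~90 degrees (x, y, yaw in the robot frame)
robot.base.go_to_relative([0.5, 0.0, 1.57])

# Move the (optional) arm to a joint configuration and close the gripper
robot.arm.go_home()
robot.arm.set_joint_positions([0.4, 0.0, -0.7, 1.0, 0.0], plan=False)
robot.gripper.close()

# Grab an RGB frame for higher-level perception code
rgb = robot.camera.get_rgb()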

MIT Media Lab Research Specialist Dr. Kate Darling looks at how robots are portrayed in popular film and TV shows.

Kate's book, The New Breed: What Our History with Animals Reveals about Our Future with Robots, can be pre-ordered now and comes out next month.

[ Kate Darling ]

The current autonomous mobility systems for planetary exploration are wheeled rovers, limited to flat, gently-sloping terrains and agglomerate regolith. These vehicles cannot tolerate instability and operate within a low-risk envelope (i.e., low-incline driving to avoid toppling). Here, we present ‘Mars Dogs’ (MD), four-legged robotic dogs, the next evolution of extreme planetary exploration.

[ Team CoSTAR ]

In 2020, first-year PhD students at the MIT Media Lab were tasked with a special project—to reimagine the Lab and write sci-fi stories about the MIT Media Lab in the year 2050. “But, we are researchers. We don't only write fiction, we also do science! So, we did what scientists do! We used a secret time machine under the MIT dome to go to the year 2050 and see what’s going on there! Luckily, the Media Lab still exists and we met someone…really cool!” Enjoy this interview of Cyber Joe, AI Mentor for MIT Media Lab Students of 2050.

[ MIT ]

In this talk, we will give an overview of the diverse research we do at CSIRO’s Robotics and Autonomous Systems Group and delve into some specific technologies we have developed including SLAM and Legged robotics. We will also give insights into CSIRO’s participation in the current DARPA Subterranean Challenge where we are deploying a fleet of heterogeneous robots into GPS-denied unknown underground environments.

[ GRASP Seminar ]

Marco Hutter (ETH) and Hae-Won Park (KAIST) talk about “Robotics Inspired by Nature.”

[ Swiss-Korean Science Club ]

Thanks Fan!

In this keynote, Guy Hoffman, Assistant Professor and the Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering at Cornell University, discusses “The Social Uncanny of Robotic Companions.”

[ Designerly HRI ]
