
#439284 A system to benchmark the posture ...

In recent years, roboticists have developed a wide variety of robots with human-like capabilities. This includes robots with bodies that structurally resemble those of humans, also known as humanoid robots.

Posted in Human Robots

#439271 Video Friday: NASA Sending Robots to ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers.

It’s ICRA this week, but since the full proceedings are not yet available, we’re going to wait until we can access everything to cover the conference properly. Or, as properly as we can not being in Xi’an right now.

We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboCup 2021 – June 22-28, 2021 – [Online Event]
RSS 2021 – July 12-16, 2021 – [Online Event]
Humanoids 2020 – July 19-21, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

NASA has selected the DAVINCI+ (Deep Atmosphere Venus Investigation of Noble-gases, Chemistry and Imaging +) mission as part of its Discovery program, and it will be the first spacecraft to enter the Venus atmosphere since NASA’s Pioneer Venus in 1978 and USSR’s Vega in 1985.

The mission, Deep Atmosphere Venus Investigation of Noble gases, Chemistry, and Imaging Plus, will consist of a spacecraft and a probe. The spacecraft will track motions of the clouds and map surface composition by measuring heat emission from Venus’ surface that escapes to space through the massive atmosphere. The probe will descend through the atmosphere, sampling its chemistry as well as the temperature, pressure, and winds. The probe will also take the first high-resolution images of Alpha Regio, an ancient highland twice the size of Texas with rugged mountains, looking for evidence that past crustal water influenced surface materials.

Launch is targeted for FY2030.

[ NASA ]

Skydio has officially launched their 3D Scan software, turning our favorite fully autonomous drone into a reality capture system.

Skydio held a launch event at the U.S. Space & Rocket Center and the keynote is online; it's actually a fairly interesting 20 minutes with some cool rockets thrown in for good measure.

[ Skydio ]

Space robotics is a key technology for space exploration and an enabling factor for future missions, both scientific and commercial. Underwater tests are a valuable tool for validating robotic technologies for space. In DFKI’s test basin, even large robots can be tested in simulated micro-gravity with mostly unrestricted range of motion.

[ DFKI ]

The Harvard Microrobotics Lab has developed a soft robotic hand with dexterous soft fingers capable of some impressive in-hand manipulation, starting (obviously) with a head of broccoli.

Training soft robots in simulation has been a bit of a challenge, but the researchers developed their own simulation framework that matches the real world pretty closely:

The simulation framework is available to download and use, and you can do some nutty things with it, like simulating tentacle basketball:

I’d pay to watch that IRL.

[ Paper ] via [ Harvard ]

Using the navigation cameras on its mast, NASA’s Curiosity Mars rover captured this movie of clouds just after sunset on March 28, 2021, the 3,072nd sol, or Martian day, of the mission. These noctilucent, or twilight, clouds are made of water ice; ice crystals reflect the setting sun, allowing the detail in each cloud to be seen more easily.

[ JPL ]

Genesis Robotics is working on something, and that's all we know.

[ Genesis Robotics ]

To further improve the autonomous capabilities of future space robots and to advance European efforts in this field, the European Union funded the ADE project, which was completed recently in Wulsbüttel near Bremen. There, the rover “SherpaTT” of the German Research Center for Artificial Intelligence (DFKI) managed to autonomously cover a distance of 500 meters in less than three hours thanks to the successful collaboration of 14 European partners.

[ DFKI ]

For $6.50, a NEXTAGE robot will make an optimized coffee for you. In Japan, of course.

[ Impress ]

Things I’m glad a robot is doing so that I don’t have to: dross skimming.

[ Fanuc ]

Today, anyone can hail a ride to experience the Waymo Driver with our fully autonomous ride-hailing service, Waymo One. Riders Ben and Ida share their experience on one of their recent multi-stop rides. Watch as they take us along for a ride.

[ Waymo ]

The IEEE Robotics and Automation Society Town Hall 2021 featured discussion around Diversity & Inclusion, RAS CARES committee & Code of Conduct, Gender Diversity, and the Developing Country Faculty Engagement Program.

[ IEEE RAS ]

Posted in Human Robots

#439263 Somehow This Robot Sticks to Ceilings by ...

Just when I think I’ve seen every possible iteration of climbing robot, someone comes up with a new way of getting robots to stick to things. The latest technique comes from the Bioinspired Robotics and Design Lab at UCSD, where they’ve managed to get a robot to stick to smooth surfaces using a vibrating motor attached to a flexible disk. How the heck does it work?

According to a paper just published in Advanced Intelligent Systems, it’s due to “the fluid mediated adhesive force between an oscillatory plate and a surface” rather than black magic. Obviously.

Weird, right? In the paper, the researchers explain what’s going on here: as the 14cm diameter flexible disk vibrates at 200 Hz, it generates a thin layer of low-pressure air between itself and the surface that it’s vibrating against. Although the layer of low-pressure air is less than 1 mm thick, the disk can resist 5 N of force pulling on it. You can sort of think of this as a suction effect, except that it doesn’t require the disk to be constantly sealed against a surface, meaning that the robot can move around without breaking adhesion.
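As a rough sanity check of those numbers, the mean pressure deficit the disk would need across its area to supply 5 N works out to only a few hundred pascals. The figures below are back-of-envelope estimates based on the reported disk size and force, not values taken from the paper:

```python
import math

# Back-of-envelope check of the reported adhesion figures:
# a 14 cm diameter disk resisting about 5 N of pull-off force.
disk_diameter_m = 0.14
pull_off_force_n = 5.0

disk_area_m2 = math.pi * (disk_diameter_m / 2) ** 2
# Mean pressure deficit needed across the disk to supply that force,
# assuming (for simplicity) the low-pressure layer spans the full disk.
pressure_deficit_pa = pull_off_force_n / disk_area_m2

print(f"disk area: {disk_area_m2 * 1e4:.0f} cm^2")
print(f"required mean pressure deficit: {pressure_deficit_pa:.0f} Pa")
print(f"as a fraction of 1 atm: {pressure_deficit_pa / 101325:.2%}")
```

That's only about a third of a percent of atmospheric pressure, which helps explain how a small vibration motor can sustain the effect continuously.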

Image: UCSD

The big advantage here is that this is about as simple and cheap as a smooth-surface climbing robot gets, especially at small(ish) scales. There are a couple of downsides too, though. The biggest one could be that 200 Hz is a frequency that’s well within human hearing, which probably explains that soundtrack in the video—the robot is, as the researchers put it, “inherently quite noisy.” And in contrast to some other controllable adhesion techniques, this system must be turned on at all times or it will immediately plunge to its doom.

The robot you’re looking at in the video (with a 14cm disk) seems to be the sweet spot when it comes to size—going smaller means that the motor starts taking up a disproportionate amount of weight, while going larger would likely not scale well either, with the overall system mass increasing faster than the amount of adhesion that you get. The researchers suggest that “it could be advantageous to combine several disk geometries to achieve the desired load capacity and resilience to disturbances,” but that’s one of a number of things that the researchers need to figure out to properly characterize this novel adhesion technique.

Gas-Lubricated Vibration-Based Adhesion for Robotics, by William P. Weston-Dawkes, Iman Adibnazari, Yi-Wen Hu, Michael Everman, Nick Gravish, and Michael T. Tolley, is available here.

Posted in Human Robots

#439243 Scientists Added a Sense of Touch to a ...

Most people probably underestimate how much our sense of touch helps us navigate the world around us. New research makes that crystal clear: a robotic arm with the ability to feel halved the time it took its user to complete tasks.

In recent years, rapid advances in both robotics and neural interfaces have brought the dream of bionic limbs (like the one sported by Luke Skywalker in the Star Wars movies) within touching distance. In 2019, researchers even unveiled a robotic prosthetic arm with a sense of touch that the user could control with their thoughts alone.

But so far, these devices have typically relied on connecting to nerves and muscles in the patient’s residual upper arm. That has meant the devices don’t work for those who have been paralyzed or whose injuries have caused too much damage to those tissues.

That may be about to change, though. For the first time, researchers have allowed a patient to control a robotic arm using a direct connection to their brain while simultaneously receiving sensory information from the device. And by closing the loop, the patient was able to complete tasks in half the time compared to controlling the arm without any feedback.

“The control is so intuitive that I’m basically just thinking about things as if I were moving my own arm,” patient Nathan Copeland, who has been working with researchers at the University of Pittsburgh for six years, told NPR.

The results, reported in Science, build on previous work from the same team that showed they could use implants in Copeland’s somatosensory cortex to trigger sensations localized to regions of his hand, despite him having lost feeling and control thanks to a spinal cord injury.

The 28-year-old had also previously controlled an external robotic arm using a neural interface wired up to his motor cortex, but in the latest experiment the researchers combined the two strands of research, with impressive results.

In a series of tasks designed to test dexterity, including moving objects of different shapes and sizes and pouring from one cup to another, Copeland was able to reduce the time he took to complete these tasks from a median of 20 seconds to just 10, and his performance was often equivalent to that of an able-bodied person.

The sensory information that Copeland receives from the arm is still fairly rudimentary. Sensors measure torque in the joints at the base of the robotic fingers, which is then translated into electrical signals and transmitted to the brain. He reported that the feedback didn’t feel natural, but more like pressure or a gentle tingling.
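As a rough illustration of that kind of encoding, here is a minimal sketch of a torque-to-stimulation mapping. The function name, torque range, current amplitudes, and linear scaling are all hypothetical assumptions chosen for illustration, not the encoding the Pittsburgh team actually used:

```python
def torque_to_stimulation(torque_nm: float,
                          torque_max_nm: float = 0.5,
                          amp_min_ua: float = 20.0,
                          amp_max_ua: float = 100.0) -> float:
    """Map a fingertip joint torque (N*m) to a stimulation current (uA).

    Hypothetical linear encoding: torques are clipped to
    [0, torque_max_nm], and zero torque produces no stimulation
    rather than the minimum amplitude.
    """
    if torque_nm <= 0:
        return 0.0
    frac = min(torque_nm, torque_max_nm) / torque_max_nm
    return amp_min_ua + frac * (amp_max_ua - amp_min_ua)
```

A linear map like this is the simplest possible choice; published intracortical stimulation work typically tunes the amplitude range per electrode so that sensations stay above detection threshold but below discomfort.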

But that’s still a lot more information than can be gleaned from simply watching the hand’s motions, which is all he had to go on before. And the approach required almost no training, unlike other popular approaches based on sensory substitution that stimulate a patch of skin or provide visual or audio cues that the patient has to learn to associate with tactile sensations.

“We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be,” Robert Gaunt, a co-author of the paper, said in a press release.

“When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way.”

An external robotic arm is still a long way from a properly integrated prosthetic, and it will likely require significant work to squeeze all the required technology into a more portable package. But Bolu Ajiboye, a neural engineer from Case Western Reserve University, told Wired that providing realistic sensory signals directly to the brain, and in particular ones that are relayed in real time, is a significant advance.

In a related perspective in Science, Aldo Faisal of Imperial College London said that the integration of a sense of touch may not only boost the performance of prosthetics, but also give patients a greater sense of ownership over their replacement limbs.

The breakthrough, he added, also opens up a host of interesting lines of scientific inquiry, including whether similar approaches could help advance robotics or be used to augment human perception with non-biological sensors.

Image Credit: RAEng_Publications from Pixabay

Posted in Human Robots

#439230 Using a virtual linkage representation ...

A team of researchers at Yale University has developed a new kind of algorithm to improve the functionality of a robot hand. In their paper published in the journal Science Robotics, the group describes their algorithm and then demonstrates, via videos, how it can be used.

Posted in Human Robots