Tag Archives: fingers

#439541 A tactile sensing mechanism for soft ...

In recent years, numerous roboticists worldwide have been trying to develop robotic systems that can artificially replicate the human sense of touch. In addition, they have been trying to create increasingly realistic and advanced bionic limbs and humanoid robots, using soft materials instead of rigid structures.

Posted in Human Robots

#439461 Video Friday: Fluidic Fingers

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Humanoids 2020 – July 19-21, 2021 – [Online Event]
RO-MAN 2021 – August 8-12, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

This 3D-printed soft robotic hand uses fluidic circuits (which respond differently to different input pressures), so a single input source can actuate three fingers independently.
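
To make the mechanism concrete, here is a minimal sketch of one way a single pressure source can address several fingers: valves with different cracking pressures open at different input levels. The thresholds are invented, and this simple monotone model ignores the richer logic real fluidic circuits can implement; it is an illustration, not the UMD design.

# Illustrative only: a finger "opens" when the input pressure exceeds its
# valve's cracking pressure. Values are made up, not from the UMD hand.
CRACKING_PRESSURES = {"index": 10.0, "middle": 20.0, "thumb": 30.0}  # kPa

def actuated_fingers(input_pressure_kpa):
    """Return the fingers whose fluidic valves open at this input pressure."""
    return [name for name, p_crack in CRACKING_PRESSURES.items()
            if input_pressure_kpa >= p_crack]

for p in (5.0, 15.0, 25.0, 35.0):
    print(f"{p:5.1f} kPa -> {actuated_fingers(p)}")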

[ UMD ]

Thanks, Fan!

Nano quadcopters are ideal for gas source localization (GSL) as they are safe, agile, and inexpensive. However, their extremely restricted sensors and computational resources make GSL a daunting challenge. In this work, we propose a novel bug algorithm named ‘Sniffy Bug’, which allows a fully autonomous swarm of gas-seeking nano quadcopters to localize a gas source in unknown, cluttered, and GPS-denied environments.
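
As a rough single-agent illustration of what a bug-style gas-seeking policy can look like (this is not the actual Sniffy Bug algorithm, which also coordinates a swarm and handles real sensing), the sketch below climbs the concentration gradient and turns away when blocked. The sensor and obstacle callbacks are hypothetical.

import math
import random

def seek_gas(read_concentration, is_blocked, pose, step=0.1, iters=200):
    """pose = (x, y, heading); greedily climb the concentration field."""
    x, y, heading = pose
    best = read_concentration(x, y)
    for _ in range(iters):
        nx = x + step * math.cos(heading)
        ny = y + step * math.sin(heading)
        if is_blocked(nx, ny):
            heading += random.uniform(0.5, 1.5)   # bug behavior: turn away
            continue
        c = read_concentration(nx, ny)
        if c >= best:
            x, y, best = nx, ny, c                # keep climbing
        else:
            heading += random.uniform(-0.8, 0.8)  # cast around for the gradient
    return x, y

# Toy field: concentration rises toward a source at (2, 3), no obstacles.
conc = lambda x, y: -math.hypot(x - 2.0, y - 3.0)
print(seek_gas(conc, lambda x, y: False, (0.0, 0.0, 0.0)))  # drifts toward (2, 3)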

[ MAVLab ]

Large-scale aerial deployment of miniature sensors in tough environmental conditions requires a deployment device that is lightweight, robust, and steerable. We present a novel samara-inspired craft that is capable of autorotating and diving.

[ Paper ]

Scientists from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have recently created a new algorithm to help a robot find efficient motion plans that ensure the physical safety of its human counterpart. In this case, the bot helped put a jacket on a human, which could prove to be a powerful tool in expanding assistance for those with disabilities or limited mobility.

[ MIT CSAIL ]

Listening to the language here about SoftBank's Whiz cleaning robot, I’ve got some concerns.

My worry is that the value the robot is adding here is mostly in the perception of cleaning, rather than, you know, actual cleaning. That’s still value, and that’s fine, but whether it’s commercially viable long-term is less certain.

[ SoftBank ]

This paper presents a novel method for multi-legged robots to probe and test the terrain for collapses using their legs while walking. The proposed method improves on existing terrain-probing approaches and integrates the probing action into the walking cycle. A follow-the-leader strategy with a suitable gait and stance is presented and implemented on a hexapod robot.
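
A hedged sketch of the core loop as described, with placeholder function names rather than CSIRO’s code: the lead leg probes each prospective foothold, and trailing legs only step onto footholds the probe has validated.

def walk_cycle(candidate_footholds, probe, step_to):
    """One follow-the-leader cycle: probe first, then commit weight."""
    safe = []
    for spot in candidate_footholds:
        if probe(spot):                     # lead leg taps terrain before loading it
            safe.append(spot)
            step_to("lead", spot)
    for i, spot in enumerate(safe):
        step_to(f"follower_{i}", spot)      # followers reuse validated footholds
    return safe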

[ CSIRO ]

Robotics researchers from NVIDIA and the University of Southern California presented DiSECt, the first differentiable simulator for robotic cutting, at the 2021 Robotics: Science and Systems (RSS) conference. The simulator accurately predicts the forces acting on a knife as it presses and slices through natural soft materials, such as fruits and vegetables.
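
To show why differentiability is the point, here is a toy stand-in (nothing like DiSECt’s actual simulation): a knife-force model that is differentiable in its stiffness parameter, so the parameter can be fit to measured forces by plain gradient descent. All numbers are invented.

def predicted_force(k, depth, break_depth=0.01):
    """Linear spring force until fracture at break_depth, zero after."""
    return k * depth if depth < break_depth else 0.0

def fit_stiffness(measured_forces, depths, k=500.0, lr=1e3, steps=200):
    """Least-squares fit of k; dF/dk is simply depth before fracture."""
    for _ in range(steps):
        grad = sum(2.0 * (predicted_force(k, d) - f) * (d if d < 0.01 else 0.0)
                   for d, f in zip(depths, measured_forces))
        k -= lr * grad
    return k

# Recover a known stiffness from synthetic measurements.
depths = [0.002, 0.004, 0.006, 0.008]
measured = [predicted_force(2000.0, d) for d in depths]
print(fit_stiffness(measured, depths))  # converges toward ~2000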

[ NVIDIA ]

These videos from Moley Robotics have too many cuts in them to properly judge how skilled the robot is, but as far as I know, it cooks the “perfect” steak only in the sense that it will cook a steak of a given weight for a given time.

[ Moley ]

Most hands are designed for general-purpose use, as it’s very tedious to make task-specific hands. Existing methods battle a trade-off between the design complexity needed for contact-rich tasks and the practical constraints of manufacturing and contact handling.

This led researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) to create a new method to computationally optimize the shape and control of a robotic manipulator for a specific task. Their system uses software to manipulate the design, simulate the robot doing a task, and then provide an optimization score to assess the design and control.
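
Schematically, that design-simulate-score loop looks like the sketch below. The names are hypothetical, random search stands in for the CSAIL optimizer, and the real system co-optimizes shape and control with far richer simulation.

def optimize_gripper(simulate_task, random_design, iters=50):
    """simulate_task(design) -> task score; keep the best design found."""
    best_design, best_score = None, float("-inf")
    for _ in range(iters):
        design = random_design()
        score = simulate_task(design)       # e.g., a grasp-success metric
        if score > best_score:
            best_design, best_score = design, score
    return best_design, best_score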

[ MIT CSAIL ]

Drone Adventures maps wildlife in Namibia from above.

[ Drone Adventures ]

Some impressive electronics disassembly tasks using a planner that just unscrews things, shakes them, and sees whether it then needs to unscrew more things.
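
That planner reads as a simple sense-act loop. Here is a hedged sketch of the logic, with every perception and manipulation call passed in as a hypothetical callback:

def disassemble(detect_screws, unscrew, shake, remove_loose_parts, done):
    """Unscrew what is visible, shake, and repeat until nothing more comes free."""
    while not done():
        for screw in detect_screws():
            unscrew(screw)
        shake()                             # loosen any newly freed parts
        if not remove_loose_parts():
            break                           # nothing came free; needs replanning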

[ Imagine ]

Unlike most state-of-the-art designs, the reconfigurable robot ReQuBiS can transition between biped, quadruped, and snake configurations without re-arranging its modules. Its design also allows the robot to split into two agents, performing tasks in parallel in biped and snake modes.

[ Paper ] via [ IvLabs ]

Thanks, Fan!

World Vision Kenya aims to improve the climate resilience of nine villages in Tana River County, sustainably manage the ecosystem amid climate change, and restore the communities’ livelihoods by reseeding hotspot areas with indigenous trees, covering at least 250 acres for every village. This can be challenging to achieve, considering the vast areas needing coverage. That’s why World Vision Kenya partnered with Kenya Flying Labs to help make this process faster, easier, and more efficient (and more fun!).

[ WeRobotics ]

Pieter Abbeel’s Robot Brains Podcast has started posting video versions of the episodes, if you’re into that sort of thing. There are interesting excerpts as well, a few of which we can share here.

[ Robot Brains ]

RSS took place this week with paper presentations, talks, Q&As, and more, but here are two of the keynotes that are definitely worth watching.

[ RSS 2021 ]

Posted in Human Robots

#439042 How Scientists Used Ultrasound to Read ...

Thanks to neural implants, mind reading is no longer science fiction.

As I’m writing this sentence, a tiny chip with arrays of electrodes could sit on my brain, listening in on the crackling of my neurons firing as my hands dance across the keyboard. Sophisticated algorithms could then decode these electrical signals in real time. The inner language my brain uses to plan and move my fingers could then guide a robotic hand to do the same. Mind-to-machine control, voilà!

Yet as the name implies, even the most advanced neural implant has a problem: it’s an implant. For electrodes to reliably read the brain’s electrical chatter, they need to pierce through its protective membrane and into brain tissue. Danger of infection aside, damage accumulates around the electrodes over time, distorting their signals or even rendering them unusable.

Now, researchers from Caltech have paved a way to read the brain without any physical contact. Key to their device is a relatively new superstar in neuroscience: functional ultrasound, which uses sound waves to capture activity in the brain.

In monkeys, the technology reliably predicted their eye movements and hand gestures after just a single trial, without the usual lengthy training process needed to decode a movement. If it can be adapted for humans, the new mind-reading tech would represent a triple triumph: it requires minimal surgery and minimal learning, but yields maximal resolution for brain decoding. For people who are paralyzed, it could be a paradigm shift in how they control their prosthetics.

“We pushed the limits of ultrasound neuroimaging and were thrilled that it could predict movement,” said study author Dr. Sumner Norman.

To Dr. Krishna Shenoy at Stanford, who was not involved, the study will finally put ultrasound “on the map as a brain-machine interface technique. Adding to this toolkit is spectacular,” he said.

Breaking the Sound Barrier
Using sound to decode brain activity might seem preposterous, but ultrasound has had quite the run in medicine. You’ve probably heard of its most common use: taking photos of a fetus in pregnancy. The technique uses a transducer, which emits ultrasound pulses into the body and finds boundaries in tissue structure by analyzing the sound waves that bounce back.
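
The arithmetic behind that bounce is simple: an echo returning after time t, in tissue with sound speed c, comes from a boundary at depth c*t/2, because the pulse travels there and back. The values below are illustrative.

SPEED_OF_SOUND_TISSUE = 1540.0      # m/s, a typical soft-tissue value

def echo_depth_m(round_trip_time_s, c=SPEED_OF_SOUND_TISSUE):
    """Depth of the reflecting boundary for a given round-trip echo time."""
    return c * round_trip_time_s / 2.0

print(echo_depth_m(65e-6))          # ~0.05 m: an echo after 65 microseconds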

Roughly a decade ago, neuroscientists realized they could adapt the tech for brain scanning. Rather than directly measuring the brain’s electrical chatter, it looks at a proxy—blood flow. When certain brain regions or circuits are active, the brain requires much more energy, which is provided by increased blood flow. In this way, functional ultrasound works similarly to functional MRI, but at a far higher resolution—roughly ten times, the authors said. Plus, people don’t have to lie very still in an expensive, claustrophobic magnet.

“A key question in this work was: If we have a technique like functional ultrasound that gives us high-resolution images of the brain’s blood flow dynamics in space and over time, is there enough information from that imaging to decode something useful about behavior?” said study author Dr. Mikhail Shapiro.

There are plenty of reasons for doubt. As the new kid on the block, functional ultrasound has some known drawbacks. A major one: it gives a far less direct signal than electrodes. Previous studies show that, with multiple measurements, it can provide a rough picture of brain activity. But is that enough detail to guide a robotic prosthesis?

One-Trial Wonder
The new study put functional ultrasound to the ultimate test: could it reliably detect movement intention in monkeys? Because their brains are the most similar to ours, rhesus macaque monkeys are often the critical step before a brain-machine interface technology is adapted for humans.

The team first inserted small ultrasound transducers into the skulls of two rhesus monkeys. While it sounds intense, the surgery doesn’t penetrate the brain or its protective membrane; it’s only on the skull. Compared to electrodes, this means the brain itself isn’t physically harmed.

The device is linked to a computer, which controls the direction of the sound waves and captures signals from the brain. For this study, the team aimed the pulses at the posterior parietal cortex, the part of the brain that plans movements. If right now you’re thinking about scrolling down this page, that’s the brain region already activated, before your fingers actually perform the movement.

Then came the tests. The first looked at eye movements—something pretty necessary before planning actual body movements without tripping all over the place. Here, the monkeys learned to focus on a central dot on a computer screen. A second dot, either left or right, then flashed. The monkeys’ task was to flick their eyes to the most recent dot. It’s something that seems easy for us, but requires sophisticated brain computation.

The second task was more straightforward. Rather than just moving their eyes to the second target dot, the monkeys learned to grab and manipulate a joystick to move a cursor to that target.

Using brain imaging to decode the mind and control movement. Image Credit: S. Norman, Caltech
As the monkeys learned, so did the device. Ultrasound data capturing brain activity was fed into a sophisticated machine learning algorithm to guess the monkeys’ intentions. Here’s the kicker: once trained, using data from just a single trial, the algorithm was able to correctly predict the monkeys’ actual eye movement—whether left or right—with roughly 78 percent accuracy. The accuracy for correctly maneuvering the joystick was even higher, at nearly 90 percent.
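
For a feel of the decoding setup, the sketch below treats each trial’s ultrasound frames as one flat feature vector and trains a linear classifier to predict movement direction. Everything here is synthetic and simplified; the study’s actual pipeline (preprocessing, model choice, validation) is more involved.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 500
X = rng.normal(size=(n_trials, n_voxels))   # stand-in ultrasound features
y = rng.integers(0, 2, n_trials)            # 0 = left, 1 = right
X[y == 1, :25] += 0.4                       # inject a weak direction signal

decoder = LogisticRegression(max_iter=1000).fit(X[:150], y[:150])
print("single-trial accuracy:", decoder.score(X[150:], y[150:]))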

That’s crazy accurate, and very much needed for a mind-controlled prosthetic. If you’re using a mind-controlled cursor or limb, the last thing you’d want is to have to imagine the movement multiple times before you actually click the web button, grab the door handle, or move your robotic leg.

Even more impressive is the resolution. Sound waves seem omnipresent, but with functional ultrasound, it’s possible to measure brain activity at a resolution of 100 microns, roughly the width of 10 neurons.

A Cyborg Future?
Before you start worrying about scientists blasting your brain with sound waves to hack your mind, don’t worry. The new tech still requires surgery, meaning that a small chunk of skull needs to be removed. However, the brain itself is spared. This means that, compared to electrodes, ultrasound could cause less damage and keep reading the mind far longer than anything currently possible.

There are downsides. Functional ultrasound is far younger than electrode-based neural implants, and can’t yet reliably decode 360-degree movement or fine finger movements. For now, the tech requires a wire linking the device to a computer, which is off-putting to many people and will hinder widespread adoption. Add to that an inherent downside of measuring blood flow rather than electrical activity: the signal lags behind electrical recordings by roughly two seconds.

All that aside, however, the tech is just tiptoeing into a future where minds and machines seamlessly connect. Ultrasound can penetrate the skull, though not yet at the resolution needed for imaging and decoding brain activity. The team is already working with human volunteers with traumatic brain injuries, who had to have a piece of their skulls removed, to see how well ultrasound works for reading their minds.

“What’s most exciting is that functional ultrasound is a young technique with huge potential. This is just our first step in bringing high performance, less invasive brain-machine interface to more people,” said Norman.

Image Credit: Free-Photos / Pixabay

Posted in Human Robots

#437876 Getting the right grip: Designing soft ...

Although robotics has reshaped and even redefined many industrial sectors, there still exists a gap between machines and humans in fields such as health and elderly care. For robots to safely manipulate or interact with fragile objects and living organisms, new strategies to enhance their perception while making their parts softer are needed. In fact, building a safe and dexterous robotic gripper with human-like capabilities is currently one of the most important goals in robotics.

Posted in Human Robots