Category Archives: Human Robots
#439847 Tiny hand-shaped gripper can grasp and ...
A team of researchers affiliated with a host of institutions in the Republic of Korea has developed a tiny, soft robotic hand that can grasp small objects and measure their temperature. They have published their results in the journal Science Robotics.
#439842 AI-Powered Brain Implant Eases Severe ...
Sarah hadn’t laughed in five years.
At 36 years old, the avid home cook has struggled with depression since early childhood. She tried the whole range of antidepressant medications and therapy for decades. Nothing worked. One night, five years ago, driving home from work, she had one thought in her mind: this is it. I’m done.
Luckily, she made it home safely. And soon she was offered an intriguing new possibility to tackle her symptoms—a little chip, implanted into her brain, that captures the unique neural signals encoding her depression. Once the implant detects those signals, it zaps them away with a brief electrical jolt, like adding noise to an enemy’s digital transmissions to scramble the original message. And when the message being scrambled is one that triggers depression, hijacking neural communications is exactly what you want to do.
Flash forward several years, and Sarah has her depression under control for the first time in her life. Her suicidal thoughts evaporated. After quitting her tech job due to her condition, she’s now back on her feet, enrolled in data analytics classes and taking care of her elderly mother. “For the first time,” she said, “I’m finally laughing.”
Sarah’s recovery is just one case. But it signifies a new era for the technology underlying her stunning improvement. It’s one of the first cases in which a personalized “brain pacemaker” can stealthily tap into, decipher, and alter a person’s mood and introspection based on their own unique electrical brain signatures. And while those implants have achieved stunning medical miracles in other areas—such as allowing people with paralysis to walk again—Sarah’s recovery is some of the strongest evidence yet that a computer chip, in a brain, powered by AI, can fundamentally alter our perception of life. It’s the closest to reading and repairing a troubled mind that we’ve ever gotten.
“We haven’t been able to do this kind of personalized therapy previously in psychiatry,” said study lead Dr. Katherine Scangos at UCSF. “This success in itself is an incredible advancement in our knowledge of the brain function that underlies mental illness.”
Brain Pacemaker
The key to Sarah’s recovery is a brain-machine interface.
Roughly the size of a matchbox, the implant sits inside the brain, silently listening to and decoding its electrical signals. Using those signals, it’s possible to control other parts of the brain or body. Brain implants have given people with lower body paralysis the ability to walk again. They’ve allowed amputees to control robotic hands with just a thought. They’ve opened up a world of sensations, integrating feedback from cyborg-like artificial limbs that transmit signals directly into the brain.
But Sarah’s implant is different.
Sensation and movement are generally controlled by relatively well-defined circuits in the outermost layer of the brain: the cortex. Emotion and mood are also products of our brain’s electrical signals, but they tend to stem from deeper neural networks hidden at the center of the brain. One way to tap into those circuits is called deep brain stimulation (DBS), a method pioneered in the ’80s that’s been used to treat severe Parkinson’s disease and epilepsy, particularly for cases that don’t usually respond to medication.
Sarah’s neural implant takes this route: it listens in on the chatter between neurons deep within the brain to decode mood.
But where is mood in the brain? One particular problem, the authors explained, is that unlike movement, there is no “depression brain region.” Rather, emotions are regulated by intricate, intertwining networks across multiple brain regions. Adding to that complexity is the fact that we’re all neural snowflakes—each of us has uniquely personalized brain network connections.
In other words, zapping my circuit to reduce depression might not work for you. DBS, for example, has previously been studied for treating depression, but despite decades of research it’s not federally approved, due to inconsistent results. The culprit? The electrical stimulation patterns used in those trials were constant and engineered to be one-size-fits-all. Have you ever grabbed “one size” socks or PJs at a department store, only to find they don’t fit? Yeah. DBS has brought about remarkable improvements for some people with depression—ill-fitting socks are better than none in a pinch. But with increasingly sophisticated neuroengineering methods, we can do better.
The solution? Let’s make altering your brain more personal.
Unconscious Reprieve
That’s the route Sarah’s care team, which included UCSF neurosurgeon Dr. Edward Chang, took in the new study.
The first step in detecting depression-related activity in the brain was to be able to listen in. The team implanted 10 electrodes in Sarah’s brain, targeting multiple regions encoding emotion-related circuits. They then recorded electrical signals from these regions over the course of 10 days, while Sarah journaled about how she felt each day—happy or low. In the background, the team peeked into her brain activity patterns, a symphony of electrical signals in multiple frequencies, like overlapping waves on the ocean.
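To give a sense of what that biomarker hunt can look like computationally, here’s a minimal sketch (not the study’s actual pipeline): extract the spectral power of each electrode’s signal in a few frequency bands, then check which ones track the daily mood reports. The sampling rate, band definitions, and mood scoring below are all hypothetical.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz (hypothetical)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "gamma": (30, 70)}  # illustrative

def band_powers(trace, fs=FS):
    """Average spectral power of one electrode trace in each band."""
    freqs, psd = welch(trace, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def mood_correlates(recordings, mood_scores):
    """Rank (channel, band) pairs by how well their power tracks mood.

    recordings:  one (n_channels, n_samples) array per day
    mood_scores: one self-reported mood rating per day (e.g., 1 = low)
    """
    n_channels = recordings[0].shape[0]
    results = {}
    for ch in range(n_channels):
        for band in BANDS:
            powers = [band_powers(day[ch])[band] for day in recordings]
            results[(ch, band)] = np.corrcoef(powers, mood_scores)[0, 1]
    # the strongest correlations are the biomarker candidates
    return sorted(results.items(), key=lambda kv: -abs(kv[1]))
```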
One particular brain wave emerged. It stemmed from the amygdala, a region normally involved in fear, lust, and other powerful emotions. Software-based mapping pinpointed the node as a powerful guide to Sarah’s mental state.
In contrast, another area tucked deep inside the brain, the ventral capsule/ventral striatum (VC/VS), emerged as a place to stimulate with little bouts of electricity to disrupt patterns leading to feelings of depression.
The team next implanted an FDA-approved neural pacemaker into the right hemisphere of her brain, with two sensing leads to capture activity from the amygdala and two stimulating wires to zap the VC/VS. The implant, previously used to treat epilepsy, continuously senses neural activity. It’s both off-the-shelf and programmable, in that the authors could instruct it to detect “pre-specified patterns of activation” related to Sarah’s depressive episodes, and deliver short bursts of electrical stimulation only then. Just randomly stimulating the amygdala could “actually cause more stress and more depression symptoms,” said Dr. Chang in a press conference.
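The study doesn’t publish the device’s programming, but the closed-loop logic it describes (watch the sensing channels for a pre-specified pattern, fire a brief burst only then, and otherwise stay quiet) might be sketched like this. The detection threshold, window length, gamma band, and refractory period are hypothetical placeholders, not the actual device’s parameters.

```python
import numpy as np

THRESHOLD = 1.5      # hypothetical detection threshold (arbitrary units)
REFRACTORY_S = 2.0   # hypothetical minimum time between bursts
BURST_MS = 6         # illustrative burst duration

def deliver_burst(duration_ms):
    """Placeholder for the implant's stimulation call to the VC/VS leads."""
    print(f"stimulating for {duration_ms} ms")

def biomarker_power(window, fs):
    """Stand-in for the on-board detector: gamma-band power of one window."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    return power[(freqs >= 30) & (freqs < 70)].mean()

def closed_loop(amygdala_stream, fs=250):
    """Scan the sensing stream; stimulate only when the pattern appears."""
    step = int(fs * 0.5)  # half-second analysis windows
    last_burst = -np.inf
    for i in range(0, len(amygdala_stream) - step, step):
        t = i / fs
        if (biomarker_power(amygdala_stream[i:i + step], fs) > THRESHOLD
                and t - last_burst >= REFRACTORY_S):
            deliver_burst(BURST_MS)
            last_burst = t
```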
Brain surgery wasn’t easy. But to Sarah, drilling several holes into her brain was less difficult than the emotional pain of her depression. Every day during the trial, she waved a figure-eight-shaped wand over her head, which wirelessly captured 90 seconds of her brain’s electrical activity, while she reported on her mental health.
When the stimulator turned on (even when she wasn’t aware it was on), “a joyous feeling just washed over me,” she said.
A New Neurological Future
For now, the results are just for one person. But if repeated—and Sarah could be a unique case—they suggest we’re finally at the point where we can tap into each unique person’s emotional mindset and fundamentally alter their perception of life.
And with that comes intense responsibility. Sarah’s neural “imprint” of her depression is tailored to her. It might be completely different for someone else. It’s something for future studies to dig into. But what’s clear is that it’s possible to regulate a person’s emotions with an AI-powered brain implant. And if other neurological disorders can be decoded in a similar way, we could use brain pacemakers to treat some of our toughest mental foes.
“God, the color differentiation is gorgeous,” said Sarah as her implant turned on. “I feel alert. I feel present.”
Image Credit: Sarah in her community garden, photo by John Lok/UCSF 2021
#439836 Video Friday: Dusty at Work
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ROSCon 2021 – October 20-21, 2021 – [Online Event]
Silicon Valley Robot Block Party – October 23, 2021 – Oakland, CA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.
I love watching Dusty Robotics' field printer at work. I don't know whether it's intentional or not, but it's got so much personality somehow.
[ Dusty Robotics ]
A busy commuter is ready to walk out the door, only to realize they've misplaced their keys and must search through piles of stuff to find them. Rapidly sifting through clutter, they wish they could figure out which pile was hiding the keys. Researchers at MIT have created a robotic system that can do just that. The system, RFusion, is a robotic arm with a camera and radio frequency (RF) antenna attached to its gripper. It fuses signals from the antenna with visual input from the camera to locate and retrieve an item, even if the item is buried under a pile and completely out of view.
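MIT hasn't released implementation details with this announcement, but the core idea (combine a coarse RF estimate that sees through clutter with a precise camera estimate that fails under occlusion) can be illustrated with simple inverse-variance fusion. Everything below is made up for illustration; treat it as a sketch of the general fusion idea, not RFusion's actual method.

```python
import numpy as np

def fuse_estimates(rf_pos, rf_var, cam_pos, cam_var):
    """Inverse-variance weighted fusion of two noisy 3-D position estimates.

    rf_pos:  estimate from the RF antenna (penetrates clutter, but coarse)
    cam_pos: estimate from the camera (precise, but unreliable when occluded)
    """
    w_rf, w_cam = 1.0 / rf_var, 1.0 / cam_var
    fused = (w_rf * rf_pos + w_cam * cam_pos) / (w_rf + w_cam)
    return fused, 1.0 / (w_rf + w_cam)

# The keys are buried, so the camera estimate gets a large variance and
# the fused estimate leans on the RF antenna.
rf_pos = np.array([0.52, 0.10, 0.30])    # meters, hypothetical
cam_pos = np.array([0.70, 0.05, 0.28])   # meters, hypothetical
fused, fused_var = fuse_estimates(rf_pos, 0.02, cam_pos, 0.25)
print(fused, fused_var)
```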
While finding lost keys is helpful, RFusion could have many broader applications in the future, like sorting through piles to fulfill orders in a warehouse, identifying and installing components in an auto manufacturing plant, or helping an elderly individual perform daily tasks in the home, though the current prototype isn't quite fast enough yet for these uses.
[ MIT ]
CSIRO Data61 had, I'm pretty sure, the most massive robots in the entire SubT competition. And this is how you solve doors with a massive robot.
[ CSIRO ]
You know how robots are supposed to be doing things that are too dangerous for humans? I think sailing through a hurricane qualifies.
This second video, also captured by this poor Saildrone, is, if anything, even worse:
[ Saildrone ] via [ NOAA ]
Soft Robotics can handle my taquitos anytime.
[ Soft Robotics ]
This is brilliant, if likely unaffordable for most people.
[ Eric Paulos ]
I do not understand this robot at all, nor can I tell whether it's friendly or potentially dangerous or both.
[ Keunwook Kim ]
This sort of thing really shouldn't have to exist for social home robots, but I'm glad it does, I guess?
It costs $100, though.
[ Digital Dream Labs ]
If you watch this video closely, you'll see that whenever a simulated ANYmal falls over, it vanishes from existence. This is a new technique for teaching robots to walk by threatening them with extinction if they fail.
But seriously, how do I get this as a screensaver?
[ RSL ]
Zimbabwe Flying Labs' Tawanda Chihambakwe shares how Zimbabwe Flying Labs got its start, how the team uses drones for STEM programs, and how drones impact conservation and agriculture.
[ Zimbabwe Flying Labs ]
DARPA thoughtfully provides a video tour of the location of every artifact on the SubT Final prize course. Some of them are hidden extraordinarily well.
Also posted by DARPA this week are full prize round run videos for every team; here are the top three: MARBLE, CSIRO Data61, and CERBERUS.
[ DARPA SubT ]
An ICRA 2021 plenary talk from Fumihito Arai at the University of Tokyo, on “Robotics and Automation in Micro & Nano-Scales.”
[ ICRA 2021 ]
This week's UPenn GRASP Lab Seminar comes from Rahul Mangharam, on “What can we learn from Autonomous Racing?”
[ UPenn ]
#439826 Autonomous Racing Drones Dodge Through ...
It seems inevitable that sooner or later, the performance of autonomous drones will
surpass the performance of even the best human pilots. Usually things in robotics that seem inevitable happen later as opposed to sooner, but drone technology seems to be the exception to this. We've seen an astonishing amount of progress over the past few years, even to the extent of sophisticated autonomy making it into the hands of consumers at an affordable price.
The cutting edge of drone research right now is putting drones with relatively simple onboard sensing and computing in situations that require fast and highly aggressive maneuvers. In a paper
published yesterday in Science Robotics, roboticists from Davide Scaramuzza's Robotics and Perception Group at the University of Zurich along with partners at Intel demonstrate a small, self-contained, fully autonomous drone that can aggressively fly through complex environments at speeds of up to 40 kph.
The trick here, to the extent that there's a trick, is that the drone performs a direct mapping of sensor input (from an Intel RealSense 435 stereo depth camera) to collision-free trajectories. Conventional obstacle avoidance involves first collecting sensor data; making a map based on that sensor data; and finally making a plan based on that map. This approach works perfectly fine as long as you're not concerned with getting all of that done quickly, but for a drone with limited onboard resources moving at high speed, it just takes too long. UZH's approach is instead to go straight from sensor input to trajectory output, which is much faster and allows the speed of the drone to increase substantially.
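To make that concrete, here's a hedged sketch in PyTorch of what "straight from sensor input to trajectory output" can look like: one forward pass from a depth image plus the drone's state to a short sequence of waypoints, with no map in between. The layer sizes and the waypoint parameterization are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SensorToTrajectory(nn.Module):
    """Map a depth image + drone state directly to trajectory waypoints,
    skipping the usual sense -> map -> plan pipeline."""

    def __init__(self, state_dim=10, n_waypoints=5):
        super().__init__()
        self.n_waypoints = n_waypoints
        self.encoder = nn.Sequential(             # depth-image features
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        self.head = nn.Sequential(                # fuse features with state
            nn.Linear(32 * 4 * 4 + state_dim, 128), nn.ReLU(),
            nn.Linear(128, n_waypoints * 3),      # 3-D waypoints, ~1 s ahead
        )

    def forward(self, depth, state):
        feats = self.encoder(depth)               # (batch, 512)
        out = self.head(torch.cat([feats, state], dim=1))
        return out.view(-1, self.n_waypoints, 3)  # (batch, waypoints, xyz)
```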
The convolutional network that performs this sensor-to-trajectory mapping was trained entirely in simulation, which is cheaper and easier but (I would have to guess) less fun than letting actual drones hammer themselves against obstacles over and over until they figure things out. A simulated “expert” drone pilot first trains its own end-to-end policy with privileges no real drone could have: access to a full 3D point cloud, perfect state estimation, and computation that's not constrained by real-time requirements. The system that will actually operate under real-life constraints then simply learns, in simulation, to match that expert as closely as possible. That's how you get expert-level performance in a form that can be taken out of simulation and transferred to a real drone without any adaptation or fine-tuning.
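In outline, that training scheme is privileged imitation learning: the student network only ever sees inputs it will also have onboard, and is regressed onto the expert's choices. A minimal sketch of one training step, reusing the hypothetical SensorToTrajectory module above (the loss and data handling are placeholders):

```python
import torch.nn.functional as F

def train_step(student, optimizer, depth, state, expert_trajectory):
    """One behavior-cloning step: match the privileged expert's output.

    depth, state:      what the student would see onboard (simulated,
                       noisy, real-time-constrained inputs)
    expert_trajectory: what the privileged planner (full 3-D point cloud,
                       perfect state, unlimited compute) chose here
    """
    pred = student(depth, state)
    loss = F.mse_loss(pred, expert_trajectory)  # imitate the expert
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```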
The other big part of this is making that sim-to-real transition, which can be problematic because simulation doesn't always do a great job of simulating everything in the world that can screw with a robot. But this method turns out to be very robust against motion blur, sensor noise, and other perception artifacts. The drone has successfully navigated real-world environments including snowy terrain, derailed trains, ruins, thick vegetation, and collapsed buildings.
“While humans require years to train, the AI, leveraging high-performance simulators, can reach comparable navigation abilities much faster, basically overnight.” -Antonio Loquercio, UZH
This is not to say that the performance here is flawless—the system still has trouble with very low illumination conditions (because the cameras simply can't see), as well as similar vision challenges like dust, fog, glare, and transparent or reflective surfaces. The training also didn't include dynamic obstacles, although the researchers tell us that moving things shouldn't be a problem even now as long as their speed relative to the drone is negligible. Many of these problems could potentially be mitigated by using
event cameras rather than traditional cameras, since faster sensors, especially ones tuned to detect motion, would be ideal for high-speed drones.
The researchers tell us that their system does not (yet) surpass the performance of expert humans in these challenging environments:
Analyzing their performance indicates that humans have a very rich and detailed understanding of their surroundings and are capable of planning and executing plans that span far in the future (our approach plans only one second into the future). Both are capabilities that today's autonomous systems still lack. We see our work as a stepping stone towards faster autonomous flight that is enabled by directly predicting collision-free trajectories from high-dimensional (noisy) sensory input.
This is one of the things that is likely coming next, though—giving the drone the ability to learn and improve from real-world experience. Coupled with more capable sensors and ever-increasing computing power, pushing that flight envelope past 40 kph in complex environments seems not just possible, but inevitable.