Xenobots 2.0: These Living Robots ...
The line between animals and machines was already getting blurry after a team of scientists and roboticists unveiled the first living robots last year. Now the same team has released version 2.0 of their so-called xenobots, and they’re faster, stronger, and more capable than ever.
In January 2020, researchers from Tufts University and the University of Vermont laid out a method for building tiny biological machines out of the eggs of the African clawed frog Xenopus laevis. Dubbed xenobots after their animal forebear, they could move independently, push objects, and even team up to create swarms.
Remarkably, building them involved no genetic engineering. Instead, the team used an evolutionary algorithm running on a supercomputer to test out thousands of potential designs made up of different configurations of cells.
Once they’d found some promising candidates that could solve the tasks they were interested in, they used microsurgical tools to build real-world versions out of living cells. The most promising design combined heart muscle cells (which could contract to propel the xenobots) with skin cells (which provided rigid support).
Impressive as that might sound, having to build each individual xenobot by hand is obviously tedious. But now the team has devised a new approach that works from the bottom up by getting the xenobots to self-assemble their bodies from single cells. Not only is the approach more scalable, the new xenobots are faster, live longer, and even have a rudimentary memory.
In a paper in Science Robotics, the researchers describe how they took stem cells from frog embryos and allowed them to grow into clumps of several thousand cells called spheroids. After a few days, the stem cells had turned into skin cells covered in small hair-like projections called cilia, which wriggle back and forth.
Normally, these structures are used to spread mucus around on the frog’s skin. But when divorced from their normal context, they took on a function more like that seen in microorganisms, which use cilia as tiny paddles to move about.
“We are witnessing the remarkable plasticity of cellular collectives, which build a rudimentary new ‘body’ that is quite distinct from their default—in this case, a frog—despite having a completely normal genome,” corresponding author Michael Levin from Tufts University said in a press release.
“We see that cells can re-purpose their genetically encoded hardware, like cilia, for new functions such as locomotion. It is amazing that cells can spontaneously take on new roles and create new body plans and behaviors without long periods of evolutionary selection for those features,” he said.
Not only were the new xenobots faster and longer-lived, they were also much better at tasks like working together as a swarm to gather piles of iron oxide particles. And while the form and function of the xenobots was achieved without any genetic engineering, in an extra experiment the team injected them with RNA that caused them to produce a fluorescent protein that changes color when exposed to a particular color of light.
This allowed the xenobots to record whether they had come into contact with a specific light source while traveling about. The researchers say this is a proof of principle that the xenobots can be imbued with a molecular memory, and future work could allow them to record multiple stimuli and potentially even react to them.
What exactly these xenobots could eventually be used for is still speculative, but they have features that make them a promising alternative to non-organic robots. For a start, robots made of stem cells are completely biodegradable and carry their own power source in the form of “yolk platelets” found in all amphibian embryos. They can also self-heal in as little as five minutes if cut, and can take advantage of cells’ ability to process all kinds of chemicals.
That suggests they could have applications in everything from therapeutics to environmental engineering. But the researchers also hope to use them to better understand the processes that allow individual cells to combine and work together to create a larger organism, and how these processes might be harnessed and guided for regenerative medicine.
As these animal-machine hybrids advance, they are sure to raise ethical concerns and questions about potential risks. But it looks like the future of robotics could be a lot more wet and squishy than we imagined.
Image Credit: Doug Blackiston/Tufts University
How Scientists Used Ultrasound to Read ...
Thanks to neural implants, mind reading is no longer science fiction.
As I’m writing this sentence, a tiny chip with arrays of electrodes could sit on my brain, listening in on the crackling of my neurons firing as my hands dance across the keyboard. Sophisticated algorithms could then decode these electrical signals in real time. My brain’s inner language to plan and move my fingers could then be used to guide a robotic hand to do the same. Mind-to-machine control, voilà!
Yet as the name implies, even the most advanced neural implant has a problem: it’s an implant. For electrodes to reliably read the brain’s electrical chatter, they need to pierce through the brain’s protective membrane and into the tissue beneath. Danger of infection aside, over time, damage accumulates around the electrodes, distorting their signals or even rendering them unusable.
Now, researchers from Caltech have paved the way to reading the brain without any physical contact. Key to their device is a relatively new superstar in neuroscience: functional ultrasound, which uses sound waves to capture activity in the brain.
In monkeys, the technology could reliably predict their eye movement and hand gestures after just a single trial—without the usual lengthy training process needed to decode a movement. If adopted by humans, the new mind-reading tech represents a triple triumph: it requires minimal surgery and minimal learning, but yields maximal resolution for brain decoding. For people who are paralyzed, it could be a paradigm shift in how they control their prosthetics.
“We pushed the limits of ultrasound neuroimaging and were thrilled that it could predict movement,” said study author Dr. Sumner Norman.
To Dr. Krishna Shenoy at Stanford, who was not involved, the study will finally put ultrasound “on the map as a brain-machine interface technique. Adding to this toolkit is spectacular,” he said.
Breaking the Sound Barrier
Using sound to decode brain activity might seem preposterous, but ultrasound has had quite the run in medicine. You’ve probably heard of its most common use: taking photos of a fetus in pregnancy. The technique uses a transducer, which emits ultrasound pulses into the body and finds boundaries in tissue structure by analyzing the sound waves that bounce back.
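As a concrete illustration of that pulse-echo principle (general to ultrasound imaging, not specific to this study), the depth of a tissue boundary follows from an echo’s round-trip time, assuming sound travels at roughly 1,540 meters per second in soft tissue:

```python
# Pulse-echo ranging: depth = (speed of sound * round-trip time) / 2.
# The factor of 2 accounts for the pulse traveling to the boundary and back.
SPEED_OF_SOUND = 1540.0  # m/s, a typical value assumed for soft tissue

def echo_depth_mm(round_trip_seconds: float) -> float:
    return SPEED_OF_SOUND * round_trip_seconds / 2 * 1000  # meters -> mm

# An echo returning after 65 microseconds implies a boundary ~50 mm deep:
print(f"{echo_depth_mm(65e-6):.1f} mm")
```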
Roughly a decade ago, neuroscientists realized they could adapt the tech for brain scanning. Rather than directly measuring the brain’s electrical chatter, it looks at a proxy—blood flow. When certain brain regions or circuits are active, the brain requires much more energy, which is provided by increased blood flow. In this way, functional ultrasound works similarly to functional MRI, but at a far higher resolution—roughly ten times, the authors said. Plus, people don’t have to lie very still in an expensive, claustrophobic magnet.
“A key question in this work was: If we have a technique like functional ultrasound that gives us high-resolution images of the brain’s blood flow dynamics in space and over time, is there enough information from that imaging to decode something useful about behavior?” said study author Dr. Mikhail Shapiro.
There are plenty of reasons for doubt. As the new kid on the block, functional ultrasound has some known drawbacks. A major one: it gives a far less direct signal than electrodes. Previous studies show that, with multiple measurements, it can provide a rough picture of brain activity. But is that enough detail to guide a robotic prosthesis?
One-Trial Wonder
The new study put functional ultrasound to the ultimate test: could it reliably detect movement intention in monkeys? Because their brains are the most similar to ours, rhesus macaque monkeys are often the critical step before a brain-machine interface technology is adapted for humans.
The team first inserted small ultrasound transducers into the skulls of two rhesus monkeys. While it sounds intense, the surgery doesn’t penetrate the brain or its protective membrane; it’s only on the skull. Compared to electrodes, this means the brain itself isn’t physically harmed.
The device is linked to a computer, which controls the direction of sound waves and captures signals from the brain. For this study, the team aimed the pulses at the posterior parietal cortex, part of the brain’s “motor” system that plans movement. If right now you’re thinking about scrolling down this page, that’s the brain region already activated, before your fingers actually perform the movement.
Then came the tests. The first looked at eye movements—something pretty necessary before planning actual body movements without tripping all over the place. Here, the monkeys learned to focus on a central dot on a computer screen. A second dot, either left or right, then flashed. The monkeys’ task was to flick their eyes to the most recent dot. It’s something that seems easy for us, but requires sophisticated brain computation.
The second task was more straightforward. Rather than just moving their eyes to the second target dot, the monkeys learned to grab and manipulate a joystick to move a cursor to that target.
Using brain imaging to decode the mind and control movement. Image Credit: S. Norman, Caltech
As the monkeys learned, so did the device. Ultrasound data capturing brain activity was fed into a sophisticated machine learning algorithm to guess the monkeys’ intentions. Here’s the kicker: once trained, using data from just a single trial, the algorithm was able to correctly predict the monkeys’ actual eye movement—whether left or right—with roughly 78 percent accuracy. The accuracy for correctly maneuvering the joystick was even higher, at nearly 90 percent.
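For a sense of what single-trial decoding means in practice, here is a minimal, hypothetical Python sketch: a linear classifier is trained on many labeled trials of synthetic “blood-flow images,” then labels each new trial on its own. The data, dimensions, and the choice of LinearDiscriminantAnalysis are assumptions for illustration, not the paper’s actual model:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: each trial is a flattened functional ultrasound
# image (a blood-flow map), labeled by movement direction (0 = left, 1 = right).
rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 1000
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 2, size=n_trials)
X[y == 1, :50] += 0.5  # inject a weak class-dependent signal for the demo

# Train on labeled trials, then classify each held-out trial individually:
decoder = LinearDiscriminantAnalysis()
scores = cross_val_score(decoder, X, y, cv=5)
print(f"mean single-trial accuracy: {scores.mean():.2f}")
```

The key point the sketch captures is that once the decoder is trained, each prediction uses only the brain data from one trial—no averaging over repeated attempts.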
That’s crazy accurate, and very much needed for a mind-controlled prosthetic. If you’re using a mind-controlled cursor or limb, the last thing you’d want is to have to imagine the movement multiple times before you actually click the web button, grab the door handle, or move your robotic leg.
Even more impressive is the resolution. Sound waves seem omnipresent, but with functional ultrasound, it’s possible to measure brain activity at a resolution of 100 microns—a span of roughly 10 neurons.
A Cyborg Future?
Before you start worrying about scientists blasting your brain with sound waves to hack your mind, rest easy. The new tech still requires skull surgery, meaning that a small chunk of skull needs to be removed. However, the brain itself is spared. This means that compared to electrodes, ultrasound could cause less damage and potentially offer a far longer-lasting mind-reading window than anything currently possible.
There are downsides. Functional ultrasound is far younger than electrode-based neural implants, and can’t yet reliably decode 360-degree movement or fine finger movements. For now, the tech also requires a wire to link the device to a computer, which is off-putting to many people and will hinder widespread adoption. Add to that an inherent limit of the approach: because blood flow responds far more slowly than electrical activity, functional ultrasound lags behind electrical recordings by roughly two seconds.
All that aside, however, the tech is just tiptoeing into a future where minds and machines seamlessly connect. Ultrasound can penetrate the skull, though not yet at the resolution needed for imaging and decoding brain activity. The team is already working with human volunteers with traumatic brain injuries, who had to have a piece of their skulls removed, to see how well ultrasound works for reading their minds.
“What’s most exciting is that functional ultrasound is a young technique with huge potential. This is just our first step in bringing high performance, less invasive brain-machine interface to more people,” said Norman.
Image Credit: Free-Photos / Pixabay