Tag Archives: live
#435793 Tiny Robots Carry Stem Cells Through a ...
Engineers have built microrobots to perform all sorts of tasks in the body, and can now add another key skill to that list: delivering stem cells. In a paper published today in Science Robotics, researchers describe propelling a magnetically controlled, stem-cell-carrying bot through a live mouse.
Under a rotating magnetic field, the microrobots moved with rolling and corkscrew-style locomotion. The researchers, led by Hongsoo Choi at the Daegu Gyeongbuk Institute of Science & Technology (DGIST), in South Korea, also demonstrated their bot’s moves in slices of mouse brain, in blood vessels isolated from rat brains, and in a multi-organ-on-a-chip.
The invention provides an alternative way to deliver stem cells, which are increasingly important in medicine. Such cells can be coaxed into becoming nearly any kind of cell, making them great candidates for treating neurodegenerative disorders such as Alzheimer’s.
But delivering stem cells typically requires an injection with a needle, which lowers the survival rate of the stem cells and limits their reach in the body. Microrobots, however, have the potential to deliver stem cells to precise, hard-to-reach areas, with less damage to surrounding tissue and better survival rates, says Jin-young Kim, a principal investigator at the DGIST-ETH Microrobotics Research Center and an author on the paper.
The virtues of microrobots have inspired several research groups to propose and test different designs in simple conditions, such as microfluidic channels and other static environments. A group out of Hong Kong last year described a burr-shaped bot that carried cells through live, transparent zebrafish.
The new research presents a magnetically actuated microrobot that successfully carried stem cells through a live mouse. In additional experiments, the cells, which had differentiated into brain cells such as astrocytes, oligodendrocytes, and neurons, transferred to microtissues on the multi-organ-on-a-chip. Taken together, the proof-of-concept experiments demonstrate the potential for microrobots to be used in human stem cell therapy, says Kim.
The team fabricated the robots with 3D laser lithography and designed them in two shapes: spherical and helical. Using a rotating magnetic field, the scientists navigated the spherical bots with a rolling motion and the helical bots with a corkscrew motion. These styles of locomotion proved more efficient than dragging the bots with a simple magnetic pulling force, and were more suitable for use in biological fluids, the scientists reported.
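For a rough sense of scale, here is a minimal back-of-the-envelope sketch of the two locomotion modes. This is not the paper’s model: it assumes ideal no-slip kinematics, and the radius, pitch, and field frequency are invented example values; the real dynamics (fluid drag, step-out frequency, wall effects) are more involved.

```python
# Illustrative only: idealized upper-bound speeds for the two
# locomotion modes. All dimensions and frequencies are hypothetical.
import math

def rolling_speed(radius_m: float, field_freq_hz: float) -> float:
    """Sphere rolling without slip, one revolution per field cycle: v = 2*pi*r*f."""
    return 2 * math.pi * radius_m * field_freq_hz

def corkscrew_speed(pitch_m: float, field_freq_hz: float) -> float:
    """Helix advancing at most one pitch per turn: v = p*f."""
    return pitch_m * field_freq_hz

# Hypothetical 75-um-radius sphere and 150-um-pitch helix at 5 Hz:
print(rolling_speed(75e-6, 5.0))     # ~2.4e-3 m/s (about 2.4 mm/s)
print(corkscrew_speed(150e-6, 5.0))  # 7.5e-4 m/s (0.75 mm/s)
```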
The big challenge in navigating microrobots in a live animal (or human body) is being able to see them in real time. Imaging with fMRI doesn’t work, because the imager’s strong magnetic fields interfere with the fields used to steer the bots. “To precisely control microbots in vivo, it is important to actually see them as they move,” the authors wrote in their paper.
That wasn’t possible during experiments in a live mouse, so the researchers had to check the location of the microrobots before and after the experiments using an optical tomography system called IVIS. They also had to resort to using a pulling force with a permanent magnet to navigate the microrobots inside the mouse, due to the limitations of the IVIS system.
Kim says he and his colleagues are developing imaging systems that will enable them to view in real time the locomotion of their microrobots in live animals.
#435769 The Ultimate Optimization Problem: How ...
Lucas Joppa thinks big. Even while gazing down into his cup of tea in his modest office on Microsoft’s campus in Redmond, Washington, he seems to see the entire planet bobbing in there like a spherical tea bag.
As Microsoft’s first chief environmental officer, Joppa came up with the company’s AI for Earth program, a five-year effort that’s spending US $50 million on AI-powered solutions to global environmental challenges.
The program is not just about specific deliverables, though. It’s also about mindset, Joppa told IEEE Spectrum in an interview in July. “It’s a plea for people to think about the Earth in the same way they think about the technologies they’re developing,” he says. “You start with an objective. So what’s our objective function for Earth?” (In computer science, an objective function describes the parameter or parameters you are trying to maximize or minimize for optimal results.)
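As a toy illustration of that computer-science sense of the term (my own sketch, nothing from Microsoft), here is a one-variable objective function handed to an off-the-shelf optimizer; the yield and habitat numbers are invented:

```python
# A toy "objective function": a quantity to maximize or minimize.
# Here a single variable x (fraction of land farmed) trades an invented
# crop-yield benefit against an invented habitat-loss cost.
from scipy.optimize import minimize_scalar

def objective(x: float) -> float:
    yield_benefit = 10 * x        # made-up: benefit grows with farmland
    habitat_cost = 15 * x ** 2    # made-up: cost grows faster
    # SciPy minimizes, so negate to turn "maximize net benefit"
    # into a minimization problem.
    return -(yield_benefit - habitat_cost)

result = minimize_scalar(objective, bounds=(0.0, 1.0), method="bounded")
print(f"best farmland fraction: {result.x:.2f}")  # ~0.33
```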
Lucas Joppa (Photo: Microsoft)
AI for Earth launched in December 2017, and Joppa’s team has since given grants to more than 400 organizations around the world. In addition to receiving funding, some grantees get help from Microsoft’s data scientists and access to the company’s computing resources.
In a wide-ranging interview about the program, Joppa described his vision of the “ultimate optimization problem”—figuring out which parts of the planet should be used for farming, cities, wilderness reserves, energy production, and so on.
Every square meter of land and water on Earth has an infinite number of possible utility functions. It’s the job of Homo sapiens to describe our overall objective for the Earth. Then it’s the job of computers to produce optimization results that are aligned with the human-defined objective.
I don’t think we’re close at all to being able to do this. I think we’re closer from a technology perspective—being able to run the model—than we are from a social perspective—being able to make decisions about what the objective should be. What do we want to do with the Earth’s surface?
Such questions are increasingly urgent, as climate change has already begun reshaping our planet and our societies. Global sea and air surface temperatures have already risen by an average of 1 degree Celsius above preindustrial levels, according to the Intergovernmental Panel on Climate Change.
Today, people all around the world participated in a “climate strike,” with young people leading the charge and demanding a global transition to renewable energy. On Monday, world leaders will gather in New York for the United Nations Climate Action Summit, where they’re expected to present plans to limit warming to 1.5 degrees Celsius.
Joppa says such summit discussions should aim for a truly holistic solution.
We talk about how to solve climate change. There’s a higher-order question for society: What climate do we want? What output from nature do we want and desire? If we could agree on those things, we could put systems in place for optimizing our environment accordingly. Instead we have this scattered approach, where we try for local optimization. But the sum of local optimizations is never a global optimization.
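A toy numerical sketch (mine, not Joppa’s) of that last point: two regions share a fixed water budget, and a compromise reached region by region, here an even split, scores worse than a single joint optimization over the whole budget. Both utility functions are invented.

```python
# Toy sketch: two regions share 10 units of water. Utilities are
# invented; region A has diminishing returns, region B linear returns.
import math

BUDGET = 10.0

def util_a(w: float) -> float:
    return 3 * math.sqrt(w)   # diminishing returns

def util_b(w: float) -> float:
    return w                  # linear returns

# "Local" compromise: each region simply takes half the budget.
even_split = util_a(BUDGET / 2) + util_b(BUDGET / 2)   # ~11.71

# Global optimization: brute-force search over joint allocations.
joint_best = max(util_a(w) + util_b(BUDGET - w)
                 for w in (i / 100 for i in range(1001)))  # ~12.25

print(f"even split: {even_split:.2f}, joint optimum: {joint_best:.2f}")
```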
There’s increasing interest in using artificial intelligence to tackle global environmental problems. New sensing technologies enable scientists to collect unprecedented amounts of data about the planet and its denizens, and AI tools are becoming vital for interpreting all that data.
The 2018 report “Harnessing AI for the Earth,” produced by the World Economic Forum and the consulting company PwC, discusses ways that AI can be used to address six of the world’s most pressing environmental challenges (climate change, biodiversity, healthy oceans, water security, clean air, and disaster resilience).
Many of the proposed applications involve better monitoring of human and natural systems, as well as modeling applications that would enable better predictions and more efficient use of natural resources.
Joppa says that AI for Earth is taking a two-pronged approach, funding efforts to collect and interpret vast amounts of data alongside efforts that use that data to help humans make better decisions. And that’s where the global optimization engine would really come in handy.
For any location on earth, you should be able to go and ask: What’s there, how much is there, and how is it changing? And more importantly: What should be there?
On land, the data is really only interesting for the first few hundred feet. Whereas in the ocean, the depth dimension is really important.
We need a planet with sensors, with roving agents, with remote sensing. Otherwise our decisions aren’t going to be any good.
AI for Earth isn’t going to create such an online portal within five years, Joppa stresses. But he hopes the projects that he’s funding will contribute to making such a portal possible—eventually.
We’re asking ourselves: What are the fundamental missing layers in the tech stack that would allow people to build a global optimization engine? Some of them are clear, some are still opaque to me.
By the end of five years, I’d like to have identified these missing layers, and have at least one example of each of the components.
Some of the projects that AI for Earth has funded seem to fit that desire. Examples include SilviaTerra, which used satellite imagery and AI to create a map of the 92 billion trees in forested areas across the United States. There’s also OceanMind, a non-profit that detects illegal fishing and helps marine authorities enforce compliance. Platforms like Wildbook and iNaturalist enable citizen scientists to upload pictures of animals and plants, aiding conservation efforts and research on biodiversity. And FarmBeats aims to enable data-driven agriculture with low-cost sensors, drones, and cloud services.
It’s not impossible to imagine putting such services together into an optimization engine that knows everything about the land, the water, and the creatures who live on planet Earth. Then we’ll just have to tell that engine what we want to do about it.
Editor’s note: This story is published in cooperation with more than 250 media organizations and independent journalists that have focused their coverage on climate change ahead of the UN Climate Action Summit. IEEE Spectrum’s participation in the Covering Climate Now partnership builds on our past reporting about this global issue.
#435748 Video Friday: This Robot Is Like a ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
RSS 2019 – June 22-26, 2019 – Freiburg, Germany
Hamlyn Symposium on Medical Robotics – June 23-26, 2019 – London, U.K.
ETH Robotics Summer School – June 27-July 1, 2019 – Zurich, Switzerland
MARSS 2019 – July 1-5, 2019 – Helsinki, Finland
ICRES 2019 – July 29-30, 2019 – London, U.K.
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.
It’s been a while since we last spoke to Joe Jones, the inventor of Roomba, about his solar-powered, weed-killing robot, called Tertill, which he was launching as a Kickstarter project. Tertill is now available for purchase (US $300) and is shipping right now.
[ Tertill ]
Usually, we don’t post videos that involve drone use that looks to be either illegal or unsafe. These flights over the protests in Hong Kong are almost certainly both. However, it’s also a unique perspective on the scale of these protests.
[ Team BlackSheep ]
ICYMI: iRobot announced this week that it has acquired Root Robotics.
[ iRobot ]
This Boston Dynamics parody video went viral this week.
The CGI is good but the gratuitous violence—even if it’s against a fake robot—is a bit too much?
This is still our favorite Boston Dynamics parody video:
[ Corridor ]
Biomedical Engineering Department Head Bin He and his team have developed the first-ever successful non-invasive mind-controlled robotic arm to continuously track a computer cursor.
[ CMU ]
Organic chemists, prepare to meet your replacement:
Automated chemical synthesis carries great promises of safety, efficiency and reproducibility for both research and industry laboratories. Current approaches are based on specifically-designed automation systems, which present two major drawbacks: (i) existing apparatus must be modified to be integrated into the automation systems; (ii) such systems are not flexible and would require substantial re-design to handle new reactions or procedures. In this paper, we propose a system based on a robot arm which, by mimicking the motions of human chemists, is able to perform complex chemical reactions without any modifications to the existing setup used by humans. The system is capable of precise liquid handling, mixing, filtering, and is flexible: new skills and procedures could be added with minimum effort. We show that the robot is able to perform a Michael reaction, reaching a yield of 34%, which is comparable to that obtained by a junior chemist (undergraduate student in Chemistry).
[ arXiv ] via [ NTU ]
So yeah, ICRA 2019 was huge and awesome. Here are some brief highlights.
[ Montreal Gazette ]
For about US $5, this drone will deliver raw meat and beer to you if you live on an uninhabited island in Tokyo Bay.
[ Nikkei ]
The Smart Microsystems Lab at Michigan State University has a new version of their Autonomous Surface Craft. It’s autonomous, open source, and awfully hard to sink.
[ SML ]
As drone shows go, this one is pretty good.
[ CCTV ]
Here’s a remote controlled robot shooting stuff with a very large gun.
[ HDT ]
Over a period of three quarters (September 2018 through May 2019), we’ve had the opportunity to work with five graduating University of Denver students as they brought their idea for a Misty II arm extension to life.
[ Misty Robotics ]
If you wonder how it looks to inspect burners and superheaters of a boiler with an Elios 2, here you are! This inspection was performed by Svenska Elektrod in a peat-fired boiler for Vattenfall in Sweden. Enjoy!
[ Flyability ]
The newest Soft Robotics technology, mGrip mini fingers, is made for tight spaces, small packaging, and delicate items, giving limitless opportunities for your applications.
[ Soft Robotics ]
What if legged robots were able to generate dynamic motions in real-time while interacting with a complex environment? Such technology would represent a significant step toward the deployment of legged systems in real-world scenarios. This means being able to replace humans in the execution of dangerous tasks and to collaborate with them in industrial applications.
This workshop aims to bring together researchers from all the relevant communities in legged locomotion such as: numerical optimization, machine learning (ML), model predictive control (MPC) and computational geometry in order to chart the most promising methods to address the above-mentioned scientific challenges.
[ Num Opt Wkshp ]
Army researchers teamed with the U.S. Marine Corps to fly and test 3-D printed quadcopter prototypes at the Marine Corps Air Ground Combat Center in Twentynine Palms, California, recently.
[ CCDC ARL ]
Lex Fridman’s Artificial Intelligence podcast featuring Rosalind Picard.
[ AI Podcast ]
In this week’s episode of Robots in Depth, Per Sjöborg speaks with Christian Guttmann, executive director of the Nordic Artificial Intelligence Institute.
Christian Guttmann talks about AI and wanting to understand intelligence well enough to recreate it. Christian has been focusing on AI in healthcare and has recently started to communicate the opportunities and challenges of artificial intelligence to the general public, something that the host, Per Sjöborg, is also very passionate about. We also get to hear about the Nordic Artificial Intelligence Institute and the work it does to inform all parts of society about AI.
[ Robots in Depth ]