
#438006 Smellicopter Drone Uses Live Moth ...

Research into robotic sensing has, understandably I guess, been very human-centric. Most of us navigate and experience the world visually and in 3D, so robots tend to get covered with things like cameras and lidar. Touch is important to us, as is sound, so robots are getting pretty good at understanding tactile and auditory information, too. Smell, though? In most cases, smell doesn’t convey nearly as much information to us, so while it hasn’t exactly been ignored in robotics, it certainly isn’t the sensing modality of choice in most cases.

Part of the problem with smell sensing is that, from a technical perspective, we just don’t have a good way of doing it. This has been a challenge for a long time, and it’s why we either bribe or trick animals like dogs, rats, and vultures into being our sensing systems for airborne chemicals. If only they’d do exactly what we wanted them to do all the time, this would be fine, but they don’t, so it’s not.

Until we get better at making chemical sensors, leveraging biology is the best we can do, and what would be ideal is some sort of robot-animal hybrid cyborg thing. We’ve seen some attempts at remote-controlled insects, but as it turns out, you can simplify things if you don’t use the entire insect and instead just find a way to use its sensing system. Enter the Smellicopter.

There’s honestly not too much to say about the drone itself. It’s an open-source drone project called Crazyflie 2.0, with some additional off-the-shelf sensors for obstacle avoidance and stabilization. The interesting bits are a couple of passive fins that keep the drone pointed into the wind, and the sensor itself, called an electroantennogram.

Image: UW

The drone’s sensor, called an electroantennogram, consists of a “single excised antenna” from a Manduca sexta hawkmoth and a custom signal processing circuit.

To make one of these sensors, you just, uh, “harvest” an antenna from a live hawkmoth. Obligingly, the moth antenna is hollow, meaning that you can stick electrodes up it. Whenever the olfactory neurons in the antenna (which is still technically alive even though it’s not attached to the moth anymore) encounter an odor that they’re looking for, they produce an electrical signal that the electrodes pick up. Plug the other ends of the electrodes into a voltage amplifier and filter, run it through an analog-to-digital converter, and you’ve got a chemical sensor that weighs just 1.5 grams and consumes only 2.7 mW of power. It’s significantly more sensitive than a conventional metal-oxide odor sensor, in a much smaller and more efficient form factor, making it ideal for drones.
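As a rough sketch of that signal chain in code (not the researchers' actual software, and with all names and threshold values hypothetical), detection amounts to sampling the amplified, filtered antenna voltage and flagging any odor-evoked deflection from the clean-air baseline:

```python
import numpy as np

def odor_detected(samples_mv, baseline_mv, threshold_mv=0.5):
    """Flag an odor hit in an electroantennogram (EAG) trace.

    samples_mv: 1-D array of ADC readings (millivolts) from the
        amplified, filtered antenna signal.
    baseline_mv: resting voltage measured in clean air.
    threshold_mv: hypothetical detection threshold; a real system
        would calibrate this per antenna.
    """
    # An odor-evoked response is a transient deflection away from
    # the resting potential, so compare each sample to the baseline.
    return bool(np.any(np.abs(samples_mv - baseline_mv) > threshold_mv))
```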

To localize an odor, the Smellicopter uses a simple bioinspired approach called crosswind casting, which involves moving laterally left and right and then forward when an odor is detected. Here’s how it works:

The vehicle takes off to a height of 40 cm and then hovers for ten seconds to allow it time to orient upwind. The Smellicopter starts casting left and right, crosswind. When a volatile chemical is detected, the Smellicopter surges 25 cm upwind and then resumes casting. As long as the wind direction is fairly consistent, this strategy brings the insect or robot progressively closer to the source with each surge.
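That loop is simple enough to express in a few lines of code. The sketch below is a hypothetical reconstruction of the logic described above, not the team’s flight software; the drone object and its methods (take_off, odor_detected, move_upwind, move_crosswind, land) are stand-ins for whatever the real controller exposes.

```python
import time

SURGE_STEP_M = 0.25  # 25 cm upwind surge, per the description above
CAST_STEP_M = 0.10   # lateral casting step (hypothetical value)

def cast_and_surge(drone, duration_s=120):
    """Bioinspired cast-and-surge odor localization, sketched."""
    drone.take_off(height_m=0.40)   # take off to 40 cm...
    time.sleep(10)                  # ...and hover while the fins orient upwind
    side = 1                        # +1 = cast right, -1 = cast left
    deadline = time.time() + duration_s
    while time.time() < deadline:
        if drone.odor_detected():
            drone.move_upwind(SURGE_STEP_M)       # surge toward the source
        else:
            drone.move_crosswind(side * CAST_STEP_M)
            side = -side            # alternate sides; real casting often
                                    # widens the sweep until odor is regained
    drone.land()
```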

Since odors are airborne, they need a bit of a breeze to spread very far, and the Smellicopter won’t be able to detect them unless it’s downwind of the source. But, that’s just how odors work— even if you’re right next to the source, if the wind is blowing from you towards the source rather than the other way around, you might not catch a whiff of it.


There are a few other constraints to keep in mind with this sensor as well. First, rather than detecting something useful (like explosives), it’s going to detect the smells of pretty flowers, because moths like pretty flowers. Second, the antenna will literally go dead on you within a couple of hours, since it only functions while its tissues are alive and metaphorically kicking. Interestingly, it may be possible to use CRISPR-based genetic modification to breed moths with antennae that do respond to useful smells, which would be a neat trick. We asked the researchers—Melanie Anderson, a doctoral student of mechanical engineering at the University of Washington, in Seattle; Thomas Daniel, a UW professor of biology; and Sawyer Fuller, a UW assistant professor of mechanical engineering—about this, along with some other burning questions, via email.

IEEE Spectrum, asking the important questions first: So who came up with “Smellicopter”?

Melanie Anderson: Tom Daniel coined the term “Smellicopter.” Another runner-up was “OdorRotor”!

In general, how much better are moths at odor localization than robots?

Melanie Anderson: Moths are excellent at odor detection and odor localization and need to be in order to find mates and food. Their antennae are much more sensitive and specialized than any portable man-made odor sensor. We can't ask the moths how exactly they search for odors so well, but being able to have the odor sensitivity of a moth on a flying platform is a big step in that direction.

Tom Daniel: Our best estimate is that they outperform robotic sensing by at least three orders of magnitude.

How does the localization behavior of the Smellicopter compare to that of a real moth?

Anderson: The cast-and-surge odor search strategy is a simplified version of what we believe the moth (and many other odor-searching animals) is doing. It is a reactive strategy that relies on the knowledge that if you detect odor, you can assume the source is somewhere upwind of you. When you detect odor, you simply move upwind, and when you lose the odor signal, you cast in a crosswind direction until you regain the signal.

Can you elaborate on the potential for CRISPR to be able to engineer moths for the detection of specific chemicals?

Anderson: CRISPR is already being used to modify the odor detection pathways in moth species. One of our future efforts is to use this specifically to make the antennae sensitive to other chemicals of interest, such as the chemical scent of explosives.

Sawyer Fuller: We think that one of the strengths of using a moth's antenna, in addition to its speed, is that it may provide a path to both high chemical specificity as well as high sensitivity. By expressing a preponderance of only one or a few chemosensors, we are anticipating that a moth antenna will give a strong response only to that chemical. There are several efforts underway in other research groups to make such specific, sensitive chemical detectors. Chemical sensing is an area where biology exceeds man-made systems in terms of efficiency, small size, and sensitivity. So that's why we think that the approach of trying to leverage biological machinery that already exists has some merit.

You mention that the antenna’s lifespan can be extended for a few days with ice. How feasible do you think this technology is outside of a research context?

Anderson: The antennae can be stored in tiny vials in a standard refrigerator or just with an ice pack to extend their life to about a week. Additionally, the process for attaching the antenna to the electrical circuit is a teachable skill. It is definitely feasible outside of a research context.

Considering the trajectory that sensor development is on, how long do you think that this biological sensor system will outperform conventional alternatives?

Anderson: It's hard to speak toward what will happen in the future, but currently, the moth antenna still stands out among any commercially available portable sensors.

There have been some experiments with cybernetic insects; what are the advantages and disadvantages of your approach, as opposed to (say) putting some sort of tracking system on a live moth?

Daniel: I was part of a cyber insect team a number of years ago. The challenge of such research is that the animal has natural reactions to attempts to steer or control it.

Anderson: While moths are better at odor tracking than robots currently, the advantage of the drone platform is that we have control over it. We can tell it to constrain the search to a certain area, and return after it finishes searching.

What can you tell us about the health, happiness, and overall welfare of the moths in your experiments?

Anderson: The moths are cold anesthetized before the antennae are removed. They are then frozen so that they can be used for teaching purposes or in other research efforts.

What are you working on next?

Daniel: The four big efforts are (1) CRISPR modification, (2) experiments aimed at improving the longevity of the antennal preparation, (3) improved measurements of antennal electrical responses to odors combined with machine learning to see if we can classify different odors, and (4) flight in outdoor environments.

Fuller: The moth's antenna sensor gives us a new ability to sense with a much shorter latency than was previously possible with similarly-sized sensors (e.g. semiconductor sensors). What exactly a robot agent should do to best take advantage of this is an open question. In particular, I think the speed may help it to zero in on plume sources in complex environments much more quickly. Think of places like indoor settings with flow down hallways that splits out at doorways, and in industrial settings festooned with pipes and equipment. We know that it is possible to search out and find odors in such scenarios, as anybody who has had to contend with an outbreak of fruit flies can attest. It is also known that these animals respond very quickly to sudden changes in odor that is present in such turbulent, patchy plumes. Since it is hard to reduce such plumes to a simple model, we think that machine learning may provide insights into how to best take advantage of the improved temporal plume information we now have available.

Tom Daniel also points out that the relative simplicity of this project (now that the UW researchers have it all figured out, that is) means that even high school students could potentially get involved in it, even if it’s on a ground robot rather than a drone. All the details are in the paper that was just published in Bioinspiration & Biomimetics.


#438001 How an Israeli Startup Is Using AI to ...

The first baby conceived using in-vitro fertilization (IVF) was born in the UK in 1978. Over 40 years later, the technique has become commonplace, but its success rate is still fairly low at around 22 to 30 percent. A female-founded Israeli startup called Embryonics is setting out to change this by using artificial intelligence to screen embryos.

IVF consists of fertilizing a woman’s egg with her partner’s or a donor’s sperm outside of her body, creating an embryo that’s then implanted in the uterus. It’s not an easy process in any sense of the word—physically, emotionally, or financially. Insurance rarely covers IVF, and the costs run anywhere from $12,000 to $25,000 per cycle (a cycle takes about a month and includes stimulating a woman’s ovaries to produce eggs, extracting the eggs, inseminating them outside the body, and implanting an embryo).

Women have to give themselves daily hormone shots to stimulate egg production, and these can cause uncomfortable side effects. After so much stress and expense, it’s disheartening to think that the odds of a successful pregnancy are, at best, one in three.

A crucial factor in whether or not an IVF cycle works—that is, whether the embryo implants in the uterus and begins to develop into a healthy fetus—is the quality of the embryo. Doctors examine embryos through a microscope to determine how many cells they contain and whether they appear healthy, and choose the one that looks most viable.

But the human eye can only see so much, even with the help of a microscope; despite embryologists’ efforts to select the “best” embryo, success rates are still relatively low. “Many decisions are based on gut feeling or personal experience,” said Embryonics founder and CEO Yael Gold-Zamir. “Even if you go to the same IVF center, two experts can give you different opinions on the same embryo.”

This is where Embryonics’ technology comes in. They used 8,789 time-lapse videos of developing embryos to train an algorithm that predicts the likelihood of successful embryo implantation. A little less than half of the embryos from the dataset were graded by embryologists, and implantation data was integrated when it was available (as a binary “successful” or “failed” metric).

The algorithm uses geometric deep learning, a technique that takes a traditional convolutional neural network—which filters input data to create maps of its features, and is most commonly used for image recognition—and applies it to more complex data like 3D objects and graphs. Within days after fertilization, the embryo is still at the blastocyst stage, essentially a microscopic clump of just 200-300 cells; the algorithm uses this deep learning technique to identify patterns in embryo development that human embryologists either wouldn’t see at all, or would need massive amounts of data to validate.
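Embryonics hasn’t published its architecture, but the basic shape of the task (a binary implantation prediction from a time-lapse video) can be sketched with an ordinary 3D convolutional network in PyTorch. This is an illustrative stand-in under that assumption, not the company’s geometric deep learning model, and every layer size here is arbitrary:

```python
import torch
import torch.nn as nn

class ImplantationPredictor(nn.Module):
    """Toy embryo time-lapse classifier (illustrative only).

    Input: a batch of grayscale videos shaped (N, 1, frames, H, W).
    Output: a probability that the embryo implants successfully.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # pool over time and space
        )
        self.head = nn.Linear(32, 1)

    def forward(self, video):
        x = self.features(video).flatten(1)
        return torch.sigmoid(self.head(x))  # implantation probability

# Example: one hypothetical video, 16 frames of 64x64 pixels.
model = ImplantationPredictor()
prob = model(torch.randn(1, 1, 16, 64, 64))
```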

On top of the embryo videos, Embryonics’ team incorporated patient data and environmental data from the lab into its algorithm, with encouraging results: the company reports that using its algorithm resulted in a 12 percent increase in positive predictive value (identifying embryos that would lead to implantation and healthy pregnancy) and a 29 percent increase in negative predictive value (identifying embryos that would not result in successful pregnancy) when compared to an external panel of embryologists.
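For readers keeping score, positive and negative predictive value are simple ratios over a confusion matrix. A quick sketch, with made-up counts purely to show the arithmetic:

```python
def predictive_values(tp, fp, tn, fn):
    """Positive/negative predictive value from confusion-matrix counts."""
    ppv = tp / (tp + fp)  # of embryos flagged viable, fraction that implanted
    npv = tn / (tn + fn)  # of embryos flagged non-viable, fraction that failed
    return ppv, npv

# Hypothetical counts, not Embryonics' data:
ppv, npv = predictive_values(tp=60, fp=20, tn=80, fn=15)
print(f"PPV={ppv:.2f}, NPV={npv:.2f}")  # PPV=0.75, NPV=0.84
```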

TechCrunch reported last week that in a pilot of 11 women who used Embryonics’ algorithm to select their embryos, 6 are enjoying successful pregnancies, while 5 are still awaiting results.

Embryonics wasn’t the first group to think of using AI to screen embryos; a similar algorithm developed in 2019 by researchers at Weill Cornell Medicine was able to classify the quality of a set of embryo images with 97 percent accuracy. But Embryonics will be one of the first to bring this sort of technology to market. The company is waiting to receive approval from European regulatory bodies to be able to sell the software to fertility clinics in Europe.

The timing is ripe: as more and more women delay having kids due to lifestyle and career-related factors, demand for IVF is growing and will likely accelerate in coming years.

The company ultimately hopes to bring its product to the US, as well as to expand its work to include using data to improve hormonal stimulation.

Image Credit: Gerd Altmann from Pixabay


#437992 This Week’s Awesome Tech Stories From ...

ARTIFICIAL INTELLIGENCE
This Chinese Lab Is Aiming for Big AI Breakthroughs
Will Knight | Wired
“China produces as many artificial intelligence researchers as the US, but it lags in key fields like machine learning. The government hopes to make up ground. …It set AI researchers the goal of making ‘fundamental breakthroughs by 2025’ and called for the country to be ‘the world’s primary innovation center by 2030.’ BAAI opened a year later, in Zhongguancun, a neighborhood of Beijing designed to replicate US innovation hubs such as Boston and Silicon Valley.”

ENVIRONMENT
What Elon Musk’s $100 Million Carbon Capture Prize Could Mean
James Temple | MIT Technology Review
“[Elon Musk] announced on Twitter that he plans to give away $100 million of [his $180 billion net worth] as a prize for the ‘best carbon capture technology.’ …Another $100 million could certainly help whatever venture, or ventures, clinch Musk’s prize. But it’s a tiny fraction of his wealth and will also only go so far. …Money aside, however, one thing Musk has a particular knack for is generating attention. And this is a space in need of it.”

HEALTH
Synthetic Cornea Helped a Legally Blind Man Regain His Sight
Steve Dent | Engadget
“While the implant doesn’t contain any electronics, it could help more people than any robotic eye. ‘After years of hard work, seeing a colleague implant the CorNeat KPro with ease and witnessing a fellow human being regain his sight the following day was electrifying and emotionally moving, there were a lot of tears in the room,’ said CorNeat Vision co-founder Dr. Gilad Litvin.”

BIOTECH
MIT Develops Method for Lab-Grown Plants That May Eventually Lead to Alternatives to Forestry and Farming
Darrell Etherington | TechCrunch
“If the work of these researchers can eventually be used to create a way to produce lab-grown wood for use in construction and fabrication in a way that’s scalable and efficient, then there’s tremendous potential in terms of reducing the impact on forestry globally. Eventually, the team even theorizes you could coax the growth of plant-based materials into specific target shapes, so you could also do some of the manufacturing in the lab, by growing a wood table directly for instance.”

AUTOMATION
FAA Approves First Fully Automated Commercial Drone Flights
Andy Pasztor and Katy Stech Ferek | The Wall Street Journal
“US aviation regulators have approved the first fully automated commercial drone flights, granting a small Massachusetts-based company permission to operate drones without hands-on piloting or direct observation by human controllers or observers. …The company’s Scout drones operate under predetermined flight programs and use acoustic technology to detect and avoid drones, birds, and other obstacles.”

SPACE
China’s Surging Private Space Industry Is Out to Challenge the US
Neel V. Patel | MIT Technology Review
“[The Ceres-1] was a commercial rocket—only the second from a Chinese company ever to go into space. And the launch happened less than three years after the company was founded. The achievement is a milestone for China’s fledgling—but rapidly growing—private space industry, an increasingly critical part of the country’s quest to dethrone the US as the world’s preeminent space power.”

CRYPTOCURRENCY
Janet Yellen Will Consider Limiting Use of Cryptocurrency
Timothy B. Lee | Ars Technica
“Cryptocurrencies could come under renewed regulatory scrutiny over the next four years if Janet Yellen, Joe Biden’s pick to lead the Treasury Department, gets her way. During Yellen’s Tuesday confirmation hearing before the Senate Finance Committee, Sen. Maggie Hassan (D-N.H.) asked Yellen about the use of cryptocurrency by terrorists and other criminals. ‘Cryptocurrencies are a particular concern,’ Yellen responded. ‘I think many are used—at least in a transactions sense—mainly for illicit financing.’”

SCIENCE
Secret Ingredient Found to Power Supernovas
Thomas Lewton | Quanta
“…Only in the last few years, with the growth of supercomputers, have theorists had enough computing power to model massive stars with the complexity needed to achieve explosions. …These new simulations are giving researchers a better understanding of exactly how supernovas have shaped the universe we see today.”

Image Credit: Ricardo Gomez Angel / Unsplash


#437990 Video Friday: Record-Breaking Drone Show ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online]
RoboSoft 2021 – April 12-16, 2021 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.

A new parent STAR robot is presented. The parent robot has a tail on which the child robot can climb. By collaborating, the two robots can reach locations that neither can reach on its own.

The parent robot can also supply the child robot with energy by recharging its batteries. The parent STAR can dispatch and recover the child STAR automatically (when aligned). The robots are fitted with sensors and controllers and have automatic capabilities but make no decisions on their own.

[ Bio-Inspired and Medical Robotics Lab ]

How TRI trains its robots.

[ TRI ]

The only thing more satisfying than one SCARA robot is two SCARA robots working together.

[ Fanuc ]

I'm not sure that this is strictly robotics, but it's so cool that it's worth a watch anyway.

[ Shinoda & Makino Lab ]

Flying insects heavily rely on optical flow for visual navigation and flight control. Roboticists have endowed small flying robots with optical flow control as well, since it requires just a tiny vision sensor. However, when using optical flow, the robots run into two problems that insects appear to have overcome. Firstly, since optical flow only provides mixed information on distances and velocities, using it for control leads to oscillations when getting closer to obstacles. Secondly, since optical flow provides very little information on obstacles in the direction of motion, it is hardest to detect obstacles that the robot is actually going to collide with! We propose a solution to these problems by means of a learning process.
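A bit of context on the first problem: for a camera flying toward a surface, the expansion (divergence) of the optical flow field scales with the ratio of approach speed to distance, so the same reading can mean fast-and-far or slow-and-near. Below is a hedged sketch of estimating that divergence from two frames with OpenCV’s Farneback flow; it is not the authors’ code, and the parameters are generic defaults rather than tuned values from the paper:

```python
import cv2
import numpy as np

def mean_flow_divergence(prev_gray, gray):
    """Mean divergence of dense optical flow between two grayscale frames.

    On approach to a surface, divergence is proportional to
    (speed / distance), i.e. inverse time-to-contact, which is why
    optical flow alone mixes distance and velocity information.
    """
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    du_dx = np.gradient(flow[..., 0], axis=1)  # d(horizontal flow)/dx
    dv_dy = np.gradient(flow[..., 1], axis=0)  # d(vertical flow)/dy
    return float(np.mean(du_dx + dv_dy))
```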

[ Nature ]

A new Guinness World Record was set on Friday in north China for the longest animation performed by 600 unmanned aerial vehicles (UAVs).

[ Xinhua ]

Translucency is prevalent in everyday scenes. As such, perception of transparent objects is essential for robots to perform manipulation. In this work, we propose LIT, a two-stage method for transparent object pose estimation using light-field sensing and photorealistic rendering.

[ University of Michigan ] via [ Fetch Robotics ]

This paper reports the technological progress and performance of team “CERBERUS” after participating in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge.

And here's a video report on the SubT Urban Beta Course performance:

[ CERBERUS ]

Congrats to Energy Robotics on 2 million euros in seed funding!

[ Energy Robotics ]

Thanks Stefan!

In just 2 minutes, watch HEBI Robotics spend 23 minutes assembling a robot arm.

HEBI Robotics is hosting a webinar called 'Redefining the Robotic Arm' next week, which you can check out at the link below.

[ HEBI Robotics ]

Thanks Hardik!

Achieving versatile robot locomotion requires motor skills which can adapt to previously unseen situations. We propose a Multi-Expert Learning Architecture (MELA) that learns to generate adaptive skills from a group of representative expert skills. During training, MELA is first initialised by a distinct set of pre-trained experts, each in a separate deep neural network (DNN). Then by learning the combination of these DNNs using a Gating Neural Network (GNN), MELA can acquire more specialised experts and transitional skills across various locomotion modes.
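The core idea, a gating network that learns to blend a set of expert networks into one policy, can be sketched generically in PyTorch. For simplicity this sketch blends the experts’ outputs, whereas the paper describes learning the combination of the expert DNNs themselves; treat it as a mixture-of-experts illustration with arbitrary layer sizes, not the authors’ implementation:

```python
import torch
import torch.nn as nn

class GatedExperts(nn.Module):
    """MELA-flavored mixture of experts: gate weights blend expert policies."""
    def __init__(self, obs_dim, act_dim, n_experts=8, hidden=64):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(obs_dim, hidden), nn.Tanh(),
                          nn.Linear(hidden, act_dim))
            for _ in range(n_experts))
        self.gate = nn.Sequential(nn.Linear(obs_dim, n_experts),
                                  nn.Softmax(dim=-1))

    def forward(self, obs):
        weights = self.gate(obs)                              # (N, n_experts)
        actions = torch.stack([e(obs) for e in self.experts], dim=1)
        # Convex combination of expert outputs -> one motor command.
        return (weights.unsqueeze(-1) * actions).sum(dim=1)

# Hypothetical dimensions: 32-D observation, 12 joint commands.
policy = GatedExperts(obs_dim=32, act_dim=12)
action = policy(torch.randn(4, 32))  # batch of 4 observations
```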

[ Paper ]

Since the dawn of history, advances in science and technology have pursued “power” and “accuracy.” Initially, “hardness” in machines and materials was sought for reliable operations. In our area of Science of Soft Robots, we have combined emerging academic fields aimed at “softness” to increase the exposure and collaboration of researchers in different fields.

[ Science of Soft Robots ]

A team from the Laboratory of Robotics and IoT for Smart Precision Agriculture and Forestry at INESC TEC – Technology and Science is creating a ROS stack solution using Husky UGV for precision field crop agriculture.

[ Clearpath Robotics ]

Associate Professor Christopher J. Hasson in the Department of Physical Therapy is the director of the Neuromotor Systems Laboratory at Northeastern University. There he is working with a robotic arm to provide enhanced assistance to physical therapy patients, while maintaining the intimate therapist-patient relationship.

[ Northeastern ]

Mobile Robotic telePresence (MRP) systems aim to support enhanced collaboration between remote and local members of a given setting. But MRP systems also put the remote user in positions where they frequently rely on the help of local partners. Getting or ‘recruiting’ such help can be done with various verbal and embodied actions ranging in explicitness. In this paper, we look at how such recruitment occurs in video data drawn from an experiment where pairs of participants (one local, one remote) performed a timed searching task.

[ Microsoft Research ]

A presentation [from Team COSTAR] for the American Geophysical Union annual fall meeting on the application of robotic multi-sensor 3D mapping for scientific exploration of caves. Lidar-based 3D maps are combined with visual/thermal/spectral/gas sensors to provide rich 3D context for scientific measurements.

[ COSTAR ]


#437984 WSR: A new Wi-Fi-based system for ...

Researchers at Harvard University have recently devised a system based on Wi-Fi sensing that could enhance the collaboration between robots operating in unmapped environments. This system, presented in a paper pre-published on arXiv, can essentially emulate antenna arrays in the air as a robot moves freely in a 2-D or 3-D environment.
