#437859 We Can Do Better Than Human-Like Hands ...
One strategy for designing robots that are capable in anthropomorphic environments is to make the robots themselves as anthropomorphic as possible. It makes sense—for example, there are stairs all over the place because humans have legs, and legs are good at stairs, so if we give robots legs like humans, they’ll be good at stairs too, right? We also see this tendency when it comes to robotic grippers, because robots need to grip things that have been optimized for human hands.
Despite some amazing robotic hands inspired by the biology of our own human hands, there are also opportunities for creativity in gripper designs that do things human hands are not physically capable of. At ICRA 2020, researchers from Stanford University presented a paper on the design of a robotic hand that has fingers made of actuated rollers, allowing it to manipulate objects in ways that would tie your fingers into knots.
While it’s got a couple fingers, this prototype “roller grasper” hand tosses anthropomorphic design out the window in favor of unique methods of in-hand manipulation. The roller grasper does share some features with other grippers designed for in-hand manipulation using active surfaces (like conveyor belts embedded in fingers), but what’s new and exciting here is that those articulated active roller fingertips (or whatever non-anthropomorphic name you want to give them) provide active surfaces that are steerable. This means that the hand can grasp objects and rotate them without having to resort to complex sequences of finger repositioning, which is how humans do it.
Photo: Stanford University
Things like picking something flat off of a table, always tricky for robotic hands (and sometimes for human hands as well), are a breeze thanks to the fingertip rollers.
Each of the hand’s fingers has three actuated degrees of freedom, which result in several different ways in which objects can be grasped and manipulated. Things like picking something flat off of a table, always tricky for robotic hands (and sometimes for human hands as well), are a breeze thanks to the fingertip rollers. The motion of an object in this gripper isn’t quite holonomic, meaning that it can’t arbitrarily reorient things without sometimes going through other intermediate steps. And it’s also not compliant in the way that many other grippers are, limiting some types of grasps. This particular design probably won’t replace every gripper out there, but it’s particularly skilled at some specific kinds of manipulations in a way that makes it unique.
We should be clear that it’s not the intent of this paper (or of this article!) to belittle five-fingered robotic hands—the point is that there are lots of things that you can do with totally different hand designs, and just because humans use one kind of hand doesn’t mean that robots need to do the same if they want to match (or exceed) some specific human capabilities. If we could make robotic hands with five fingers that had all of the actuation and sensing and control that our own hands do, that would be amazing, but it’s probably decades away. In the meantime, there are plenty of different designs to explore.
And speaking of exploring different designs, these same folks are already at work on version two of their hand, which replaces the fingertip rollers with fingertip balls:
For more on this new version of the hand (among other things), we spoke with lead author Shenli Yuan via email. And the ICRA page is here if you have questions of your own.
IEEE Spectrum: Human hands are often seen as the standard for manipulation. Given that adding degrees of freedom that human hands don’t have (as in your work) can make robotic hands more capable than ours in many ways, do you think we should still treat human hands as something to try and emulate?
Shenli Yuan: Yes, definitely. Not only because human hands have great manipulation capability, but because we’re constantly surrounded by objects that were designed and built specifically to be manipulated by the human hand. Anthropomorphic robot hands are still worth investigating, and still have a long way to go before they truly match the dexterity of a human hand. The design we came up with is an exploration of what unique capabilities may be achieved if we are not bound by the constraints of anthropomorphism, and what a biologically impossible mechanism may achieve in robotic manipulation. In addition, for lots of tasks, it isn’t necessarily optimal to try and emulate the human hand. Perhaps in 20 to 50 years when robot manipulators are much better, they won’t look like the human hand that much. The design constraints for robotics and biology have points in common (like mechanical wear and finite tendon stiffness) but also major differences (like continuous rotation for robots and fewer heat-dissipation problems for humans).
“For lots of tasks, it isn’t necessarily optimal to try and emulate the human hand. Perhaps in 20 to 50 years when robot manipulators are much better, they won’t look like the human hand that much.”
—Shenli Yuan, Stanford University
What are some manipulation capabilities of human hands that are the most difficult to replicate with your system?
There are a few things that come to mind. It cannot perform a power grasp (using the whole hand for grasping, as opposed to a pinch grasp that uses only the fingertips), which is something that can be easily done by human hands. It cannot move or rotate objects instantaneously in arbitrary directions or about arbitrary axes, though the human hand is somewhat limited in this respect as well. It also cannot perform finger gaiting. That being said, these limitations exist largely because this grasper only has 9 degrees of freedom, as opposed to the human hand, which has more than 20. We don’t think of this grasper as a replacement for anthropomorphic hands, but rather as a way to provide unique capabilities without all of the complexity associated with a highly actuated, humanlike hand.
What’s the most surprising or impressive thing that your hand is able to do?
The most impressive feature is that it can rotate objects continuously, which is typically difficult or inefficient for humanlike robot hands. Something really surprising was that we put most of our energy into the design and analysis of the grasper, and the control strategy we implemented for demonstrations is very simple. This simple control strategy works surprisingly well with very little tuning or trial-and-error.
With this many degrees of freedom, how complicated is it to get the hand to do what you want it to do?
The number of degrees of freedom is actually not what makes controlling it difficult. Most of the difficulties we encountered were actually due to the rolling contact between the rollers and the object during manipulation. The rolling behavior can be viewed as constantly breaking and re-establishing contacts between the rollers and objects, and this very dynamic behavior introduces uncertainties in controlling our grasper. Specifically, it was difficult to estimate the velocity of each contact point with the object, which changes based on object and finger position, object shape (especially curvature), and slip/no slip.
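For readers wondering why contact-velocity estimation is the hard part, the underlying relation is just the rigid-body formula v_contact = v_center + ω × r; the difficulty Yuan describes is that r and the slip condition keep changing as the object rolls. Here is a minimal sketch of that relation, not the paper’s estimator, with made-up numbers:

```python
import numpy as np

def contact_point_velocity(v_roller_center, omega_roller, r_center_to_contact):
    """Surface velocity of a roller at its contact point (rigid body, no slip).

    v_contact = v_center + omega x r, where r points from the roller's center
    to the contact point. Under a no-slip assumption this is also the velocity
    imparted to the grasped object at that contact.
    """
    return np.asarray(v_roller_center) + np.cross(omega_roller, r_center_to_contact)

# Hypothetical numbers: a 10-mm-radius roller spinning at 2 rad/s with a
# stationary center and the contact point directly "below" the roller center.
v = contact_point_velocity(
    v_roller_center=[0.0, 0.0, 0.0],
    omega_roller=[0.0, 2.0, 0.0],           # rad/s about the roller's y-axis
    r_center_to_contact=[0.0, 0.0, -0.01],  # meters, center -> contact
)
print(v)  # ~[-0.02, 0, 0] m/s tangential surface speed at the contact
```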
What more can you tell us about Roller Grasper V2?
Roller Grasper V2 has spherical rollers, while V1 has cylindrical rollers. We realized that cylindrical rollers are very good at manipulating objects when the rollers and the object form line contacts, but the grasp can be unstable when its geometry doesn’t allow for a line contact between each roller and the grasped object. Spherical rollers solve that problem by allowing predictable points of contact regardless of how a surface is oriented.
The parallelogram mechanism of Roller Grasper V1 makes the pivot axis offset a bit from the center of the roller, which made our control and analysis more challenging. The kinematics of Roller Grasper V2 are simpler. The base joint intersects with the finger, which intersects with the pivot joint, and the pivot joint intersects with the roller joint. Its symmetrical design and simpler kinematics make our control and analysis a lot more straightforward. Roller Grasper V2 also has a larger pivot range of 180 degrees, while V1 is limited to 90 degrees.
In terms of control, we implemented more sophisticated control strategies (including a hand-crafted control strategy and an imitation learning based strategy) for the grasper to perform autonomous in-hand manipulation.
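That joint layout is concrete enough to write down. Here is a minimal sketch of the finger chain described above, assuming placeholder limits for the base joint; only the pivot ranges and the 3-DOF-per-finger, 9-DOF-total counts come from the interview and the paper summary:

```python
from dataclasses import dataclass

@dataclass
class RollerFinger:
    """One finger of the roller grasper: base joint -> pivot joint -> roller joint."""
    base_range_deg: tuple[float, float]   # placeholder limits; not given in the article
    pivot_range_deg: tuple[float, float]  # steers the roller's rolling direction
    roller_continuous: bool = True        # the roller itself can spin without limit

# Per the interview: V1 pivots over 90 degrees, V2 over a full 180 degrees;
# both hands have three fingers with 3 actuated DOF each (9 DOF total).
grasper_v1 = [RollerFinger(base_range_deg=(-45, 45), pivot_range_deg=(-45, 45)) for _ in range(3)]
grasper_v2 = [RollerFinger(base_range_deg=(-45, 45), pivot_range_deg=(-90, 90)) for _ in range(3)]

print(len(grasper_v2) * 3)  # 9 degrees of freedom
```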
“Design of a Roller-Based Dexterous Hand for Object Grasping and Within-Hand Manipulation,” by Shenli Yuan, Austin D. Epps, Jerome B. Nowak, and J. Kenneth Salisbury from Stanford University is being presented at ICRA 2020.
#437851 Boston Dynamics’ Spot Robot Dog ...
Boston Dynamics has been fielding questions about when its robots are going to go on sale and how much they’ll cost for at least a dozen years now. I can say this with confidence, because that’s how long I’ve been a robotics journalist, and I’ve been pestering them about it the entire time. But it’s only relatively recently that the company started to make a concerted push away from developing robots exclusively for the likes of DARPA into platforms with more commercial potential, starting with a compact legged robot called Spot, first introduced in 2016.
Since then, we’ve been following closely as Spot has gone from a research platform to a product, and today, Boston Dynamics is announcing the final step in that process: commercial availability. You can now order a Spot Explorer Kit from the Boston Dynamics online store for US $74,500 (plus tax), shipping included, with delivery in 6 to 8 weeks. FINALLY!
Over the past 10 months or so, Boston Dynamics has leased Spot robots to carefully selected companies, research groups, and even a few individuals as part of their early adopter program—that’s where all of the clips in the video below came from. While there are over 100 Spots out in the world right now, getting one of them has required convincing Boston Dynamics up front that you knew more or less exactly what you wanted to do and how you wanted to do it. If you’re a big construction company or the Jet Propulsion Laboratory or Adam Savage, that’s all well and good, but for other folks who think that a Spot could be useful for them somehow and want to give it a shot, this new availability provides a fewer-strings-attached opportunity to do some experimentation with the robot.
There’s a lot of cool stuff going on in that video, but we were told that the one thing that really stood out to the folks at Boston Dynamics was a 2-second clip that you can see on the left-hand side of the screen from 0:19 to 0:21. In it, Spot is somehow managing to walk across a spider web of rebar without getting tripped up, at faster than human speed. This isn’t something that Spot was specifically programmed to do, and in fact the Spot User Guide specifically identifies “rebar mesh” as an unsafe operating environment. But the robot just handles it, and that’s a big part of what makes Spot so useful—its ability to deal with (almost) whatever you can throw at it.
Before you get too excited, Boston Dynamics is fairly explicit that the current license for the robot is intended for commercial use, and the company specifically doesn’t want people to be just using it at home for fun. We know this because we asked (of course we asked), and they told us “we specifically don’t want people to just be using it at home for fun.” Drat. You can still buy one as an individual, but you have to promise that you’ll follow the terms of use and user guidelines, and it sounds like using a robot in your house might be the second-fastest way to invalidate your warranty:
SPOT IS AN AMAZING ROBOT, BUT IS NOT CERTIFIED SAFE FOR IN-HOME USE OR INTENDED FOR USE NEAR CHILDREN OR OTHERS WHO MAY NOT APPRECIATE THE HAZARDS ASSOCIATED WITH ITS OPERATION.
Not being able to get Spot to play with your kids may be disappointing, but for those of you with the sort of kids who are also students, the good news is that Boston Dynamics has carved out a niche for academic institutions, which can buy Spot at a discounted price. And if you want to buy a whole pack of Spots, there’s a bulk discount for Enterprise users as well.
What do you get for $74,500? All this!
Spot robot
Spot battery (2x)
Spot charger
Tablet controller and charger
Robot case for storage and transportation
FREE SHIPPING!
Photo: Boston Dynamics
The basic package includes the robot, two batteries, charger, a tablet controller, and a storage case.
You can view detailed specs here.
So is $75k a lot of money for a robot like Spot, or not all that much? We don’t have many useful points of comparison, partially because it’s not clear to what extent other pre-commercial quadrupedal robots (like ANYmal or Aliengo) share capabilities and features with Spot. For more perspective on Spot’s price tag, we spoke to Michael Perry, vice president of business development at Boston Dynamics.
IEEE Spectrum: Why is Spot so affordable?
Michael Perry: The main goal of selling the robot at this stage is to try to get it into the hands of as many application developers as possible, so that we can learn from the community what the biggest driver of value is for Spot. As a platform, unlocking the value of an ecosystem is our core focus right now.
Spectrum: Why is Spot so expensive?
Perry: Expensive is relative, but compared to the initial prototypes of Spot, we’ve been able to drop down the cost pretty significantly. One key thing has been designing it for robustness—we’ve put hundreds and hundreds of hours on the robot to make sure that it’s able to be successful when it falls, or when it has an electrostatic discharge. We’ve made sure that it’s able to perceive a wide variety of environments that are difficult for traditional vision-based sensors to handle. A lot of that engineering is baked into the core product so that you don’t have to worry about the mobility or robotic side of the equation, you can just focus on application development.
Photos: Boston Dynamics
Accessories for Spot include [clockwise from top left]: Spot GXP with additional ports for payload integration; Spot CAM with panorama camera and advanced comms; Spot CAM+ with pan-tilt-zoom camera for inspections; Spot EAP with lidar to enhance autonomy on large sites; Spot EAP+ with Spot CAM camera plus lidar; and Spot CORE for additional processing power.
The $75k that you’ll pay for the Spot Explorer Kit, it’s important to note, is just the base price for the robot. As with other things that fall into this price range (like a luxury car), there are all kinds of fun ways to drive that cost up with accessories, although for Spot, some of those accessories will be necessary for many (if not most) applications. For example, a couple of expansion ports to make it easier to install your own payloads on Spot will run you $1,275. An additional battery is $4,620. And if you want to really get some work done, the Enhanced Autonomy Package (with 360 cameras, lights, better comms, and a Velodyne VLP-16) will set you back an additional $34,570. If you were hoping for an arm, you’ll have to wait until the end of the year.
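To see how quickly the accessories add up, here is a quick tally of one illustrative configuration using only the prices quoted in this article (the particular bundle is arbitrary, not a recommended build):

```python
# Prices quoted in the article (USD); the chosen bundle is illustrative only.
spot_prices = {
    "Spot Explorer Kit (base robot)": 74_500,
    "Expansion ports for payloads (Spot GXP)": 1_275,
    "Additional battery": 4_620,
    "Enhanced Autonomy Package (Spot EAP)": 34_570,
    "Spot CARE service plan, 1 year": 12_000,
}

total = sum(spot_prices.values())
print(f"Example fully kitted-out configuration: ${total:,}")  # $126,965
```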
Each Spot also includes a year’s worth of software updates and a warranty, although the standard warranty just covers “defects related to materials and workmanship,” not “I drove my robot off a cliff” or “I tried to take my robot swimming.” For that sort of thing (user error) to be covered, you’ll need to upgrade to the $12,000 Spot CARE premium service plan, which covers your robot for a year as long as you don’t subject it to willful abuse (and both of the examples I just gave probably qualify).
While we’re on the subject of robot abuse, Boston Dynamics has very sensibly devoted a substantial amount of the Spot User Guide to help new users understand how they should not be using their robot, in order to “lessen the risk of serious injury, death, or robot and other property damage.” According to the guide, some things that could cause Spot to fall include holes, cliffs, slippery surfaces (like ice and wet grass), and cords. Spot’s sensors also get confused by “transparent, mirrored, or very bright obstacles,” and the guide specifically says Spot “may crash into glass doors and windows.” Also this: “Spot cannot predict trajectories of moving objects. Do not operate Spot around moving objects such as vehicles, children, or pets.”
We should emphasize that this is all totally reasonable, and while there are certainly a lot of things to be aware of, it’s frankly astonishing that these are the only things that Boston Dynamics explicitly warns users against. Obviously, not every potentially unsafe situation or thing is described above, but the point is that Boston Dynamics is willing to say to new users, “here’s your robot, go do stuff with it” without feeling the need to hold their hand the entire time.
There’s one more thing to be aware of before you decide to buy a Spot, which is the following:
“All orders will be subject to Boston Dynamics’ Terms and Conditions of Sale which require the beneficial use of its robots.”
Specifically, this appears to mean that you aren’t allowed to (or supposed to) use the robot in a way that could hurt living things, or “as a weapon, or to enable any weapon.” The conditions of sale also prohibit using the robot for “any illegal or ultra-hazardous purpose,” and there’s some stuff in there about it not being cool to use Spot for “nuclear, chemical, or biological weapons proliferation, or development of missile technology,” which seems weirdly specific.
“Once you make a technology more broadly available, the story of it starts slipping out of your hands. Our hope is that ahead of time we’re able to clearly articulate the beneficial uses of the robot in environments where we think the robot has a high potential to reduce the risk to people, rather than potentially causing harm.”
—Michael Perry, Boston Dynamics
I’m very glad that Boston Dynamics is being so upfront about requiring that Spot is used beneficially. However, it does put the company in a somewhat challenging position now that these robots are being sold. Boston Dynamics can (and will) perform some amount of due-diligence before shipping a Spot, but ultimately, once the robots are in someone else’s hands, there’s only so much that BD can do.
Spectrum: Why is beneficial use important to Boston Dynamics?
Perry: One of the key things that we’ve highlighted many times in our license and terms of use is that we don’t want to see the robot being used in any way that inflicts physical harm on people or animals. There are philosophical reasons for that—I think all of us don’t want to see our technology used in a way that would hurt people. But also from a business perspective, robots are really terrible at conveying intention. In order for the robot to be helpful long-term, it has to be trusted as a piece of technology. So rather than looking at a robot and wondering, “is this something that could potentially hurt me,” we want people to think “this is a robot that’s here to help me.” To the extent that people associate Boston Dynamics with cutting edge robots, we think that this is an important stance for the rollout of our first commercial product. If we find out that somebody’s violated our terms of use, their warranty is invalidated, we won’t repair their product, and we have a licensing timeout that would prevent them from accessing their robot after that timeout has expired. It’s a remediation path, but we do think that it’s important to at least provide that as something that helps enforce our position on use of our technology.
It’s very important to keep all of this in context: Spot is a tool. It’s got some autonomy and the appearance of agency, but it’s still just doing what people tell it to do, even if those things might be unsafe. If you read through the user guide, it’s clear how much of an effort Boston Dynamics is making to try to convey the importance of safety to Spot users—and ultimately, barring some unforeseen and catastrophic software or hardware issues, safety is about the users, rather than Boston Dynamics or Spot itself. I bring this up because as we start seeing more and more Spots doing things without Boston Dynamics watching over them quite so closely, accidents are likely inevitable. Spot might step on someone’s foot. It might knock someone over. If Spot was perfectly safe, it wouldn’t be useful, and we have to acknowledge that its impressive capabilities come with some risks, too.
Photo: Boston Dynamics
Each Spot includes a year’s worth of software updates and a warranty, although the standard warranty just covers “defects related to materials and workmanship” not “I drove my robot off a cliff.”
Now that Spot is on the market for real, we’re excited to see who steps up and orders one. Depending on who the potential customer is, Spot could either seem like an impossibly sophisticated piece of technology that they’d never be able to use, or a magical way of solving all of their problems overnight. In reality, it’s of course neither of those things. For the former (folks with an idea but without a lot of robotics knowledge or experience), Spot does a lot out of the box, but BD is happy to talk with people and facilitate connections with partners who might be able to integrate specific software and hardware to get Spot to do a unique task. And for the latter (who may also be folks with an idea but without a lot of robotics knowledge or experience), BD’s Perry offers a reminder that Spot is not Rosie the Robot, and he would be equally happy to talk about what the technology is actually capable of doing.
Looking forward a bit, we asked Perry whether Spot’s capabilities mean that customers are starting to think beyond using robots to simply replace humans, and are instead looking at them as a way of enabling a completely different way of getting things done.
Spectrum: Do customers interested in Spot tend to think of it as a way of replacing humans at a specific task, or as a system that can do things that humans aren’t able to do?
Perry: There are what I imagine as three levels of people understanding the robot applications. Right now, we’re at level one, where you take a person out of this dangerous, dull job, and put a robot in. That’s the entry point. The second level is, using the robot, can we increase the production of that task? For example, take site documentation on a construction site—right now, people do 360 image capture of a site maybe once a week, and they might do a laser scan of the site once per project. At the second level, the question is, what if you were able to get that data collection every day, or multiple times a day? What kinds of benefits would that add to your process? To continue the construction example, the third level would be, how could we completely redesign this space now that we know that this type of automation is available? To take one example, there are some things that we cannot physically build because it’s too unsafe for people to be a part of that process, but if you were to apply robotics to that process, then you could potentially open up a huge envelope of design that has been inaccessible to people.
To order a Spot of your very own, visit shop.bostondynamics.com.
A version of this post appears in the August 2020 print issue as “$74,500 Will Fetch You a Spot.”
#437845 Video Friday: Harmonic Bionics ...
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICRA 2020 – May 31-August 31, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.
Designed to protect employees and passengers from both harmful pathogens and cleaning agents, Breezy One can quickly, safely and effectively decontaminate spaces over 100,000 square feet in 1.5 hours with a patented, environmentally safe disinfectant. Breezy One was co-developed with the City of Albuquerque’s Aviation Department, where it autonomously sanitizes the Sunport’s facilities every night in the ongoing fight against COVID-19.
[ Fetch Robotics ]
Harmonic Bionics is redefining upper extremity neurorehabilitation with intelligent robotic technology designed to maximize patient recovery. Harmony SHR, our flagship product, works with a patient’s scapulohumeral rhythm (SHR) to enable natural, comprehensive therapy for both arms. When combined with Harmony’s Weight Support mode, this unique shoulder design may allow for earlier initiation of post-stroke therapy as Harmony can support a partial dislocation or subluxation of the shoulder prior to initiating traditional therapy exercises.
Harmony’s Preprogrammed Exercises promote functional treatment through patient-specific movements that can enable an increased number of repetitions per session without placing a larger physical burden on therapists or their resources. As the only rehabilitation exoskeleton with Bilateral Sync Therapy (BST), Harmony enables intent-based therapy by registering healthy arm movements and synchronizing that motion onto the stroke-affected side to help reestablish neural pathways.
[ Harmonic Bionics ]
Thanks Mok!
Some impressive work here from IHMC and IIT getting Atlas to take steps upward in a way that’s much more human-like than robot-like, which ends up reducing maximum torque requirements by 20 percent.
[ Paper ]
GITAI’s G1 is a general-purpose robot dedicated to space. The G1 will enable automation of various tasks inside and outside space stations, as well as for lunar base development.
[ GITAI ]
Malloy Aeronautics, which now makes drones rather than hoverbikes, has been working with the Royal Navy in New Zealand to figure out how to get cargo drones to land on ships.
The challenge was to test autonomous landing of heavy-lift UAVs on a moving ship; however, due to the COVID-19 lockdown, no ship trials were possible. The moving deck was simulated by driving a vehicle and trailer across an airfield while carrying out multiple landings and take-offs. The autonomous system partner was Planck Aerosystems, and autolanding was triggered by a camera on the UAV reading a QR code on the trailer.
[ Malloy Aeronautics ]
Thanks Paul!
Tertill looks to be relentlessly effective.
[ Franklin Robotics ]
A Swedish company, Tiki Safety, has experienced a record number of orders for its protective masks. At ABB, we are grateful for the opportunity to help Tiki Safety speed up its manufacturing process from 6 minutes to 40 seconds.
[ Tiki Safety ]
The Korea Atomic Energy Research Institute is not messing around with ARMstrong, their robot for nuclear and radiation emergency response.
[ KAERI ]
OMOY is a robot that communicates with its users via internal weight shifting.
[ Paper ]
Now this, this is some weird stuff.
[ Segway ]
CaTARo is a Care Training Assistant Robot from the AIS Lab at Ritsumeikan University.
[ AIS Lab ]
Originally launched in 2015 to assist workers in lightweight assembly tasks, ABB’s collaborative YuMi robot has gone on to blaze a trail in a raft of diverse applications and industries, opening new opportunities and helping to fire people’s imaginations about what can be achieved with robotic automation.
[ ABB ]
This music video features COMAN+, from the Humanoids and Human Centered Mechatronics Lab at IIT, doing what you’d call dance moves if you dance like I do.
[ Alex Braga ] via [ IIT ]
The NVIDIA Isaac Software Development Kit (SDK) enables accelerated AI robot development workflows. Stacked with new tools and application support, Isaac SDK 2020.1 is an end-to-end solution supporting each step of robot fleet deployment, from design collaboration and training to the ongoing maintenance of AI applications.
[ NVIDIA ]
Robot Spy Komodo Dragon and Spy Pig film “a tender moment” between Komodo dragons but will they both survive the encounter?
[ BBC ] via [ Laughing Squid ]
This is part one of a mostly excellent five-part documentary about ROS produced by Red Hat. I say mostly only because they put ME in it for some reason, but fortunately, they talked with many of the core team that developed ROS back at Willow Garage back in the day, and it’s definitely worth watching.
[ Red Hat Open Source Stories ]
It’s been a while, but here’s an update on SRI’s Abacus Drive, from Alexander Kernbaum.
[ SRI ]
This Robots For Infectious Diseases interview features IEEE Fellow Antonio Bicchi, professor of robotics at the University of Pisa, talking about how Italy has been using technology to help manage COVID-19.
[ R4ID ]
Two more interviews this week of celebrity roboticists from MassRobotics: Helen Greiner and Marc Raibert. I’d introduce them, but you know who they are already!
[ MassRobotics ]
#437796 AI Seeks ET: Machine Learning Powers ...
Can artificial intelligence help the search for life elsewhere in the solar system? NASA thinks the answer may be “yes”—and not just on Mars either.
A pilot AI system is now being tested for use on the ExoMars mission that is currently slated to launch in the summer or fall of 2022. The machine-learning algorithms being developed will help science teams decide how to test Martian soil samples to return only the most meaningful data.
For ExoMars, the AI system will only be used back on Earth to analyze data gathered by the ExoMars rover. But if the system proves to be as useful to the rover as now suspected, a NASA mission to Saturn’s moon Titan (now scheduled for a 2026 launch) could automate the scientific sleuthing process in the field. This mission will rely on the Dragonfly octocopter drone to fly from surface location to surface location through Titan’s dense atmosphere and drill for signs of life there.
The hunt for microbial life in another world’s soil, either as fossilized remnants or as present-day samples, is very challenging, says Eric Lyness, software lead of the NASA Goddard Planetary Environments Lab in Greenbelt, Md. There is of course no precedent to draw upon, because no one has yet succeeded in astrobiology’s holy grail quest.
But that doesn’t mean AI can’t provide substantial assistance. Lyness explained that for the past few years he’d been puzzling over how to automate portions of an exploratory mission’s geochemical investigation, wherever in the solar system the scientific craft may be.
Last year he decided to try machine learning. “So we got some interns,” he said. “People right out of college or in college, who have been studying machine learning. … And they did some amazing stuff. It turned into much more than we expected.” Lyness and his collaborators presented their scientific analysis algorithm at a geochemistry conference last month.
Illustration: ESA
The ExoMars rover, named Rosalind Franklin, will be the first that can drill down to 2-meter depths, where living soil bacteria could possibly be found.
ExoMars’s rover—named Rosalind Franklin, after the scientist whose work was crucial to discovering the structure of DNA—will be the first that can drill down to 2-meter depths, beyond where solar UV light might penetrate and kill any life forms. In other words, ExoMars will be the first Martian craft with the ability to reach soil depths where living soil bacteria could possibly be found.
“We could potentially find forms of life, microbes or other things like that,” Lyness said. However, he quickly added, very little conclusive evidence today exists to suggest that there’s present-day (microbial) life on Mars. (NASA’s Curiosity rover has sent back some inexplicable observations of both methane and molecular oxygen in the Martian atmosphere that could conceivably be a sign of microbial life forms, though non-biological processes could explain these anomalies too.)
Less controversially, the Rosalind Franklin rover’s drill could also turn up fossilized evidence of life in the Martian soil from earlier epochs when Mars was more hospitable.
NASA’s contribution to the joint Russian/European Space Agency ExoMars project is an instrument called a mass spectrometer that will be used to analyze soil samples from the drill cores. Here, Lyness said, is where AI could really provide a helping hand.
Because the Dragonfly drone and possibly a future mission to Jupiter’s moon Europa would be operating in hostile environments with less opportunity for data transmission to Earth, automating a craft’s astrobiological exploration would be practically a requirement.
The spectrometer, which studies the mass distribution of ions in a sample of material, works by blasting the drilled soil sample with a laser and then mapping out the atomic masses of the various molecules and portions of molecules that the laser has liberated. The problem is that any given mass spectrum could originate from any number of source compounds, minerals, and components, which makes analyzing a mass spectrum a gigantic puzzle.
Lyness said his group is studying the mineral montmorillonite, a commonplace component of the Martian soil, to see the many ways it might reveal itself in a mass spectrum. Then his team sneaks in an organic compound with the montmorillonite sample to see how that changes the mass spectrometer output.
“It could take a long time to really break down a spectrum and understand why you’re seeing peaks at certain [masses] in the spectrum,” he said. “So anything you can do to point scientists into a direction that says, ‘Don’t worry, I know it’s not this kind of thing or that kind of thing,’ they can more quickly identify what’s in there.”
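The article doesn’t spell out the team’s algorithm, but the workflow Lyness describes (generate lab spectra of montmorillonite with and without an organic compound mixed in, then learn to flag what a new spectrum is consistent with) maps naturally onto ordinary supervised classification. The sketch below is purely illustrative and is not the Goddard team’s code; the binning, peak positions, and model choice are all assumptions:

```python
# Hedged sketch: a generic classifier over binned mass spectra, standing in for
# whatever model the Goddard team actually uses.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_BINS = 500  # intensity histogram over the instrument's mass range (assumption)

def make_fake_spectrum(has_organic: bool, rng) -> np.ndarray:
    """Stand-in for a lab spectrum: mineral peaks plus, optionally, organic peaks."""
    spectrum = rng.random(N_BINS) * 0.1           # background noise
    spectrum[[50, 120, 300]] += 1.0               # "montmorillonite" peaks (made up)
    if has_organic:
        spectrum[[80, 210]] += 0.6                # "organic compound" peaks (made up)
    return spectrum

rng = np.random.default_rng(0)
X = np.array([make_fake_spectrum(i % 2 == 0, rng) for i in range(200)])
y = np.array([i % 2 == 0 for i in range(200)])    # True = organic present

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
new_spectrum = make_fake_spectrum(True, rng)
print(clf.predict_proba([new_spectrum]))          # [[P(no organic), P(organic)]]
```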
Lyness said the ExoMars mission will provide a fertile training ground for his team’s as-yet-unnamed AI algorithm. (He said he’s open to suggestions—though, please, no spoof Boaty McBoatface submissions need apply.)
Because the Dragonfly drone and possibly a future astrobiology mission to Jupiter’s moon Europa would be operating in much more hostile environments with much less opportunity for data transmission back and forth to Earth, automating a craft’s astrobiological exploration would be practically a requirement.
All of which points to a future in the mid-2030s in which a nuclear-powered octocopter on a moon of Saturn flies from location to location to drill for evidence of life on this tantalizingly bio-possible world. And machine learning will help power the science.
“We should be researching how to make the science instruments smarter,” Lyness said. “If you can make it smarter at the source, especially for planetary exploration, it has huge payoffs.”
#437778 A Bug-Sized Camera for Bug-Sized Robots ...
As if it’s not hard enough to make very small mobile robots, once you’ve gotten the power and autonomy all figured out (good luck with that), your robot isn’t going to be all that useful unless it can carry some payload. And the payload that everybody wants robots to carry is a camera, which is of course a relatively big, heavy, power hungry payload. Great, just great.
This whole thing is frustrating because tiny, lightweight, power efficient vision systems are all around us. Literally, all around us right this second, stuffed into the heads of insects. We can’t make anything quite that brilliant (yet), but roboticists from the University of Washington, in Seattle, have gotten us a bit closer, with the smallest wireless, steerable video camera we’ve ever seen—small enough to fit on the back of a microbot, or even a live bug.
To make a camera this small, the UW researchers, led by Shyam Gollakota, a professor of computer science and engineering, had to start nearly from scratch, primarily because existing systems aren’t nearly so constrained by power availability. Even things like swallowable pill cameras require batteries that weigh more than a gram, but only power the camera for under half an hour. With a focus on small size and efficiency, they started with an off-the-shelf ultra low-power image sensor that’s 2.3 mm wide and weighs 6.7 mg. They stuck on a Bluetooth 5.0 chip (3 mm wide, 6.8 mg), and had a fun time connecting those two things together without any intermediary hardware to broadcast the camera output. A functional wireless camera also requires a lens (20 mg) and an antenna, which is just 5 mm of wire. An accelerometer is useful so that insect motion can be used to trigger the camera, minimizing the redundant frames that you’d get from a robot or an insect taking a nap.
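The accelerometer-as-trigger idea is straightforward to sketch: capture and transmit a frame only when recent motion exceeds a threshold, so a napping insect doesn’t burn battery on redundant frames. The logic below is illustrative only, not the UW firmware; the threshold value and the sensor-reading function are assumptions:

```python
import random
import time

MOTION_THRESHOLD_G = 0.05   # assumed trigger level, in g (not from the paper)
MAX_FPS = 5                 # the system streams at up to 5 frames per second

def accel_magnitude() -> float:
    """Stand-in for reading the onboard accelerometer: returns |a| in g (simulated)."""
    return 1.0 + random.uniform(-0.1, 0.1)

def capture_and_send_frame() -> None:
    """Stand-in for grabbing a 160x120 monochrome frame and sending it over the radio."""
    print("frame captured")

last_frame_time = 0.0
for _ in range(100):  # real firmware would loop forever and sleep between reads
    moving = abs(accel_magnitude() - 1.0) > MOTION_THRESHOLD_G  # ~1 g when at rest
    now = time.monotonic()
    if moving and (now - last_frame_time) >= 1.0 / MAX_FPS:
        capture_and_send_frame()
        last_frame_time = now
    time.sleep(0.01)
```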
Photo: University of Washington
The microcamera developed by the UW researchers can stream monochrome video at up to 5 frames per second to a cellphone 120 meters away.
The last bit to make up this system is a mechanically steerable “head,” weighing 35 mg and bringing the total weight of the wireless camera system to 84 mg. If the look of the little piezoelectric actuator seems familiar, you have very good eyes because it’s tiny, and also, it’s the same kind of piezoelectric actuator that the folks at UW use to power their itty bitty flying robots. It’s got a 60-degree panning range, but also requires a 96 mg boost converter to function, which is a huge investment in size and weight just to be able to point the camera a little bit. But overall, the researchers say that this pays off, because not having to turn the entire robot (or insect) when you want to look around reduces the energy consumption of the system as a whole by a factor of up to 84 (!).
Photo: University of Washington
Insects are very mobile platforms for outdoor use, but they’re also not easy to steer, so the researchers also built a little insect-scale robot that they could remotely control while watching the camera feed. As it turns out, this seems to be the smallest, power-autonomous terrestrial robot with a camera ever made.
This efficiency means that the wireless camera system can stream video frames (160×120 pixels monochrome) to a cell phone up to 120 meters away for up to 6 hours when powered by a 0.5-g, 10-mAh battery. A live, first-bug view can be streamed at up to 5 frames per second. The system was successfully tested on a pair of darkling beetles that were allowed to roam freely outdoors, and the researchers noted that they could also mount it on spiders or moths, or anything else that could handle the payload. (The researchers removed the electronics from the insects after the experiments and observed no noticeable adverse effects on their behavior.)
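Those figures imply a strikingly small average power draw, which is easy to sanity-check (the battery voltage is an assumption; the article gives only the capacity and runtime):

```python
battery_capacity_mah = 10      # from the article
runtime_hours = 6              # from the article
battery_voltage = 3.7          # assumption: a typical lithium-polymer cell voltage

avg_current_ma = battery_capacity_mah / runtime_hours
avg_power_mw = avg_current_ma * battery_voltage
print(f"~{avg_current_ma:.1f} mA average draw, ~{avg_power_mw:.1f} mW")  # ~1.7 mA, ~6.2 mW
```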
The researchers are already thinking about what it might take to put a wireless camera system on something that flies, and it’s not going to be easy—a bumblebee can only carry between 100 and 200 mg. The power system is the primary limitation here, but it might be possible to use a solar cell to cut down on battery requirements. And the camera itself could be scaled down as well, by using a completely custom sensor and a different type of lens. The other thing to consider is that with a long-range wireless link and a vision system, it’s possible to add sophisticated vision-based autonomy to tiny robots by doing the computation remotely. So, next time you see something scuttling across the ground, give it another look, because it might be looking right back at you.
“Wireless steerable vision for live insects and insect-scale robots,” by Vikram Iyer, Ali Najafi, Johannes James, Sawyer Fuller, and Shyamnath Gollakota from the University of Washington, is published in Science Robotics.