#437733 Video Friday: MIT Media Lab Developing ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AWS Cloud Robotics Summit – August 18-19, 2020 – [Online Conference]
CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

Very impressive local obstacle avoidance at a fairly high speed on a small drone, both indoors and outdoors.

[ FAST Lab ]

Matt Carney writes:

My PhD at MIT Media Lab has been the design and build of a next generation powered prosthesis. The bionic ankle, named TF8, was designed to provide biologically equivalent power and range of motion for plantarflexion-dorsiflexion. This video shows the process of going from a blank sheet of paper to people walking on it. Shown are three different people wearing the robot. About a dozen people have since been able to test the hardware.

[ MIT ]

Thanks Matt!

Exciting changes are coming to the iRobot® Home App. Get ready for new personalized experiences, improved features, and an easy-to-use interface. The update is rolling out over the next few weeks!

[ iRobot ]

MOFLIN is an AI Pet created from a totally new concept. It possesses emotional capabilities that evolve like living animals. With its warm soft fur, cute sounds, and adorable movement, you’d want to love it forever. We took a nature inspired approach and developed a unique algorithm that allows MOFLIN to learn and grow by constantly using its interactions to determine patterns and evaluate its surroundings from its sensors. MOFLIN will choose from an infinite number of mobile and sound pattern combinations to respond and express its feelings. To put it in simple terms, it’s like you’re interacting with a living pet.

You lost me at “it’s like you’re interacting with a living pet.”

[ Kickstarter ] via [ Gizmodo ]

This video is only robotics-adjacent, but it has applications for robotic insects. With a high-speed tracking system, we can now follow insects as they jump and fly, and watch how clumsy (but effective) they are at it.

[ Paper ]

Thanks Sawyer!

Suzumori Endo Lab, Tokyo Tech has developed self-excited pneumatic actuators that can be integrally molded by a 3D printer. These actuators use the “automatic flow path switching mechanism” we have devised.

[ Suzumori Endo Lab ]

Quadrupeds are getting so much better at deciding where to step rather than just stepping where they like and trying not to fall over.

[ RSL ]

Omnidirectional micro aerial vehicles are a growing field of research, with demonstrated advantages for aerial interaction and uninhibited observation. While systems with complete pose omnidirectionality and high hover efficiency have been developed independently, a robust system that combines the two has not been demonstrated to date. This paper presents the design and optimal control of a novel omnidirectional vehicle that can exert a wrench in any orientation while maintaining efficient flight configurations.

[ ASL ]

The latest in smooth humanoid walking from Dr. Guero.

[ YouTube ]

Will robots replace humans one day? When it comes to space exploration, robots are our precursors, gathering data to prepare humans for deep space. ESA robotics engineer Martin Azkarate discusses some of the upcoming missions involving robots and the unique science they will perform in this episode of Meet the Experts.

[ ESA ]

The Multi-robot Systems Group at FEE-CTU in Prague is working on an autonomous drone that detects fires and then shoots an extinguisher capsule at them.

[ MRS ]

This experiment with HEAP (Hydraulic Excavator for Autonomous Purposes) demonstrates our latest research in on-site and mobile digital fabrication with found materials. The embankment prototype in natural granular material was achieved using state of the art design and construction processes in mapping, modelling, planning and control. The entire process of building the embankment was fully autonomous. An operator was only present in the cabin for safety purposes.

[ RSL ]

The Simulation, Systems Optimization and Robotics Group (SIM) of Technische Universität Darmstadt’s Department of Computer Science conducts research on cooperating autonomous mobile robots, biologically inspired robots and numerical optimization and control methods.

[ SIM ]

Starting January 1, 2021, your drone platform of choice may be severely limited by the European Union’s new drone regulations. In this short video, senseFly’s Brock Ryder explains what that means for drone programs and operators and where senseFly drones fit in the EU’s new regulatory framework.

[ SenseFly ]

Nearly every company across every industry is looking for new ways to minimize human contact, cut costs and address the labor crunch in repetitive and dangerous jobs. WSJ explores why many are looking to robots as the solution for all three.

[ WSJ ]

You’ll need to prepare yourself emotionally for this video on “Examining Users’ Attitude Towards Robot Punishment.”

[ ACM ]

In this episode of the AI Podcast, Lex interviews Russ Tedrake (MIT and TRI) about biped locomotion, the DRC, home robots, and more.

[ AI Podcast ]

#437723 Minuscule RoBeetle Turns Liquid Methanol ...

It’s no secret that one of the most significant constraints on robots is power. Most robots need lots of it, and it has to come from somewhere, with that somewhere usually being a battery, because there simply aren’t many other good options. Batteries, however, are famous for having poor energy density, and the smaller your robot is, the more of a problem this becomes. And the issue goes beyond the battery itself, carrying over into all the other components it takes to turn the stored energy into useful work, which again is a particular problem for small-scale robots.

In a paper published this week in Science Robotics, researchers from the University of Southern California, in Los Angeles, demonstrate RoBeetle, an 88-milligram, four-legged robot that runs entirely on methanol, a power-dense liquid fuel. Without any electronics at all, it uses an exceptionally clever bit of mechanical autonomy to convert methanol vapor directly into forward motion, one millimeter-long step at a time.

It’s not entirely clear from the video how the robot actually works, so let’s go through how it’s put together, and then look at the actuation cycle.

Image: Science Robotics

RoBeetle (A) uses a methanol-based actuation mechanism (B). The robot’s body (C) includes the fuel tank subassembly (D), a tank lid, transmission, and sliding shutter (E), bottom side of the sliding shutter (F), nickel-titanium-platinum composite wire and leaf spring (G), and front legs and hind legs with bioinspired backward-oriented claws (H).

The body of RoBeetle is a boxy fuel tank that you can fill with methanol by poking a syringe through a fuel inlet hole. It’s a quadruped, more or less, with fixed hind legs and two front legs attached to a single transmission that moves them both at once in a sort of rocking forward and up followed by backward and down motion. The transmission is hooked up to a leaf spring that’s tensioned to always pull it backward, such that when the robot isn’t being actuated, the spring and transmission keep its front legs more or less vertical and allow the robot to stand. Those horns are primarily there to hold the leaf spring in place, but they’ve got little hooks that can carry stuff, too.

The actuator itself is a nickel-titanium (NiTi) shape-memory alloy (SMA), which is just a wire that contracts when it heats up and then stretches back out when it cools. SMAs are fairly common and used for all kinds of things, but what makes this particular SMA a little different is that it’s been messily coated with platinum. The “messily” part is important for a reason that we’ll get to in just a second.

One end of the SMA wire is attached to the middle of the leaf spring, while the other end runs above the back of the robot, where it’s stapled to an anchor block on the robot’s rear end. With the SMA wire hooked up but not actuated (i.e., cold rather than warm), it’s slack enough that the leaf spring pulls the transmission back, rocking the legs forward and up. The last component is embedded in the robot’s back, right along the spine and directly underneath the SMA actuator: a sliding vent attached to the transmission, positioned so that the vent is open when the SMA wire is cold and the leaf spring is pulled back, and closed when the SMA wire is warm and has contracted against the spring. The way that the sliding vent is attached to the transmission is the really clever bit about this robot, because it means that the motion of the wire itself is used to modulate the flow of fuel through a purely mechanical system. Essentially, it’s an actuator and a sensor at the same time.

The actuation cycle that causes the robot to walk begins with a full fuel tank and a cold SMA wire. There’s tension on the leaf spring, pulling the transmission back and rocking the legs forward and upward. The transmission also pulls the sliding vent into the open position, allowing methanol vapor to escape up out of the fuel tank and into the air, where it wafts past the SMA wire that runs directly above the vent.

The platinum facilitates a reaction of the methanol (CH3OH) with oxygen in the air (combustion, although not the dramatic flaming and explosive kind) that generates a couple of water molecules, some carbon dioxide, and a bunch of heat. This is where the messy platinum coating is important: messy means lots of surface area, letting the platinum interact with as much methanol as possible. In just a second or two, the temperature of the SMA wire skyrockets from 50 to 100 ºC and it contracts by about 0.1 mm, pulling against the leaf spring. As the wire contracts, the transmission moves the legs backwards and downwards, and the robot pulls itself forward about 1.2 mm. At the same time, the transmission closes off the sliding vent, cutting off the supply of methanol vapor. Without the vapor reacting with the platinum and generating heat, the SMA wire cools down in about a second and a half. As it does, it stretches back out under the tension of the leaf spring, and the cycle starts over again. Top speed is 0.76 mm/s (0.05 body-lengths per second).
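If you want a feel for the timing of that feedback loop, here’s a rough simulation sketch in Python. The phase durations are our assumptions pulled from the numbers above, not measurements from the paper:

    # Minimal sketch of RoBeetle's mechanical feedback loop, using the rough
    # numbers quoted above. The phase durations are assumptions from the text.

    HEAT_TIME_S = 1.5   # vent open: catalytic heating ("a second or two")
    COOL_TIME_S = 1.5   # vent closed: cooling ("about a second and a half")
    STRIDE_MM = 1.2     # forward motion per power stroke

    def simulate(duration_s: float) -> float:
        """Return distance traveled (in mm) after duration_s seconds."""
        t, x, vent_open = 0.0, 0.0, True
        while t < duration_s:
            if vent_open:
                # Methanol vapor combusts on the platinum-coated wire; the
                # wire contracts, driving the power stroke and closing the vent.
                t += HEAT_TIME_S
                x += STRIDE_MM
                vent_open = False
            else:
                # No fuel flow: the wire cools and stretches back out under
                # the leaf spring, reopening the vent.
                t += COOL_TIME_S
                vent_open = True
        return x

    print(simulate(60.0) / 60.0)  # ~0.4 mm/s

With these assumed timings the model walks at about 0.4 mm/s, the same order of magnitude as the reported top speed; presumably 0.76 mm/s corresponds to the fast end of the heating and cooling times.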

An interesting environmental effect is that the speed of the robot can be enhanced by a gentle breeze. This is because air moving over the SMA wire cools it down a bit faster while also blowing away any residual methanol from around the vents, shutting down the reaction more completely. RoBeetle can carry more than its own body weight in fuel, and it takes approximately 155 minutes for a full tank of methanol to completely evaporate. It’s worth noting that despite the very high energy density of methanol, this is actually a stupendously inefficient way of powering a robot, with an estimated end-to-end efficiency of just 0.48 percent. Not 48 percent, mind you, but 0.48 percent. In general, powering SMAs with electricity is much more efficient.

However, you have to look at the entire system that would be necessary to deliver that electricity, and for a robot as small as RoBeetle, the researchers say that it’s basically impossible. The lightest commercially available battery and power supply that would deliver enough juice to heat up an SMA actuator weighs about 800 mg, nearly 10 times the total weight of RoBeetle itself. From that perspective, RoBeetle’s efficiency is actually pretty good.
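To see why, here’s a hedged back-of-the-envelope comparison. The energy densities and the efficiency of an electrically heated SMA below are textbook-style assumptions, not figures from the paper:

    # Why 0.48% efficiency on methanol can still beat a battery at this scale.
    # All energy densities and the electrical-SMA efficiency are assumptions.

    ROBOT_MG = 88.0
    FUEL_MG = 95.0            # "more than its own body weight in fuel" (assumed)
    BATTERY_MG = 800.0        # lightest battery plus power supply, per above

    METHANOL_J_PER_MG = 20.0  # ~20 MJ/kg lower heating value
    LIION_J_PER_MG = 0.7      # ~0.7 MJ/kg for small lithium-ion cells (assumed)

    CHEMO_MECH_EFF = 0.0048   # RoBeetle's reported end-to-end efficiency
    ELEC_SMA_EFF = 0.03       # assumed efficiency of a resistively heated SMA

    methanol_work_j = FUEL_MG * METHANOL_J_PER_MG * CHEMO_MECH_EFF
    battery_work_j = BATTERY_MG * LIION_J_PER_MG * ELEC_SMA_EFF

    # Normalize by total system mass: the fuel rides on an 88 mg robot, while
    # the battery system alone is nearly 10 times the robot's mass.
    print(methanol_work_j / (ROBOT_MG + FUEL_MG))    # ~0.05 J/mg
    print(battery_work_j / (ROBOT_MG + BATTERY_MG))  # ~0.02 J/mg

Under these assumptions, the methanol system delivers roughly twice the mechanical work per milligram of total system mass, despite its dismal efficiency.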

Image: A. Kitterman/Science Robotics; adapted from R.L.T./MIT

Comparison of various untethered microrobots and bioinspired soft robots that use different power and actuation strategies.

There are some other downsides to RoBeetle we should mention—it can only move forwards, not backwards, and it can’t steer. Its speed isn’t adjustable, and once it starts walking, it’ll walk until it either breaks or runs out of fuel. The researchers have some ideas about the speed, at least, pointing out that increasing the speed of fuel delivery by using pressurized liquid fuels like butane or propane would increase the actuator output frequency. And the frequency, amplitude, and efficiency of the SMAs themselves can be massively increased “by arranging multiple fiber-like thin artificial muscles in hierarchical configurations similar to those observed in sarcomere-based animal muscle,” making RoBeetle even more beetle-like.

As for sensing, RoBeetle’s 230-mg payload is enough to carry passive sensors, but getting those sensors to usefully interact with the robot itself to enable any kind of autonomy remains a challenge. Mechanical intelligence is certainly possible, though, and we can imagine RoBeetle adopting some of the same sorts of systems that have been proposed for the clockwork rover that JPL wants to use for Venus exploration. The researchers also mention how RoBeetle could potentially serve as a model for microbots capable of aerial locomotion, which is something we’d very much like to see.

“An 88-milligram insect-scale autonomous crawling robot driven by a catalytic artificial muscle,” by Xiufeng Yang, Longlong Chang, and Néstor O. Pérez-Arancibia from the University of Southern California, in Los Angeles, was published in Science Robotics.

#437716 Robotic Tank Is Designed to Crawl ...

Let’s talk about bowels! Most of us have them, most of us use them a lot, and like anything that gets used a lot, they eventually need to get checked out to help make sure that everything will keep working the way it should for as long as you need it to. Generally, this means a colonoscopy, and while there are other ways of investigating what’s going on in your gut, a camera on a flexible tube is still “the gold-standard method of diagnosis and intervention,” according to some robotics researchers who want to change that up a bit.

The University of Colorado’s Advanced Medical Technologies Lab has been working on a tank robot called Endoculus that’s able to actively drive itself through your intestines, rather than being shoved. The good news is that it’s very small, and the bad news is that it’s probably not as small as you’d like it to be.

The reason why a robot like Endoculus is necessary (or at least a good idea) is that trying to stuff a semi-rigid endoscopy tube into the semi-floppy tube that is your intestine doesn’t always go smoothly. Sometimes, the tip of the endoscopy tube can get stuck, and as more tube is fed in, it causes the intestine to distend, which in the best case is painful and in the worst case can cause serious internal injuries. One way of solving this is with swallowable camera pills, but those don’t help you with tasks like taking tissue samples. A self-propelled system like Endoculus could reduce risk while also making the procedure faster and cheaper.

Image: Advanced Medical Technologies Lab/University of Colorado

The researchers say that while the width of Endoculus is larger than a traditional endoscope, the device would require “minimal distention during use” and would “not cause pain or harm to the patient.” Future versions of the robot, they add, will “yield a smaller footprint.”

Endoculus gets around with four sets of treads, angled to provide better traction against the curved walls of your gut. The treads are micropillared, or covered with small nubs, which helps them deal with all your “slippery colon mucosa.” Designing the robot was particularly tricky because of the severe constraints on the overall size of the device, which is just 3 centimeters wide and 2.3 cm high. In order to cram in the two motors required for full control, the researchers had to arrange them parallel to the treads, resulting in a fairly complex system of 3D-printed worm gears. And to make the robot actually useful, it includes a camera, LED lights, tubes for injecting air and water, and a tool port that can accommodate endoscopy instruments like forceps and snares to retrieve tissue samples.
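The paper doesn’t spell out the control mixing, but with two motors and four tread sets, the natural arrangement is left/right tread pairs driven skid-steer style. Here’s a sketch under that assumption; the pairing and the track width are our guesses, not specs from the paper:

    # Hypothetical skid-steer mixing for a two-motor, four-tread capsule.
    # The left/right motor pairing and the track width are assumptions.

    TRACK_WIDTH_MM = 24.0  # assumed lateral spacing between tread pairs

    def tread_speeds(forward_mm_s: float, turn_rad_s: float) -> tuple[float, float]:
        """Map a desired body velocity to (left, right) tread speeds in mm/s."""
        half = turn_rad_s * TRACK_WIDTH_MM / 2.0
        return forward_mm_s - half, forward_mm_s + half

    print(tread_speeds(40.0, 0.0))  # (40.0, 40.0): straight at top speed
    print(tread_speeds(20.0, 0.5))  # (14.0, 26.0): gentle turn while crawling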

So far, Endoculus has spent some time inside of a live pig, although it wasn’t able to get that far since pig intestines are smaller than human intestines, and because apparently the pig intestine is spiraled somehow. The pig (and the robot) both came out fine. A (presumably different) pig then provided some intestine that was expanded to human-intestine size, inside of which Endoculus did much better, and was able to zip along at up to 40 millimeters per second without causing any damage. Personally, I’m not sure I’d want a robot to explore my intestine at a speed much higher than that.

The next step with Endoculus is to add some autonomy, which means figuring out how to do localization and mapping using the robot’s onboard camera and IMU. And then of course someone has to be the first human to experience Endoculus directly, which I’d totally volunteer for except the research team is in Colorado and I’m not. Sorry!

“Novel Optimization-Based Design and Surgical Evaluation of a Treaded Robotic Capsule Colonoscope,” by Gregory A. Formosa, J. Micah Prendergast, Steven A. Edmundowicz, and Mark E. Rentschler, from the University of Colorado, was presented at ICRA 2020.

#437709 iRobot Announces Major Software Update, ...

Since the release of the very first Roomba in 2002, iRobot’s long-term goal has been to deliver cleaner floors in a way that’s effortless and invisible. Which sounds pretty great, right? And arguably, iRobot has managed to do exactly this, with its most recent generation of robot vacuums that make their own maps and empty their own dustbins. For those of us who trust our robots, this is awesome, but iRobot has gradually been realizing that many Roomba users either don’t want this level of autonomy, or aren’t ready for it.

Today, iRobot is announcing a major new update to its app that represents a significant shift of its overall approach to home robot autonomy. Humans are being brought back into the loop through software that tries to learn when, where, and how you clean so that your Roomba can adapt itself to your life rather than the other way around.

To understand why this is such a shift for iRobot, let’s take a very brief look back at how the Roomba interface has evolved over the last couple of decades. The first generation of Roomba had three buttons on it that allowed (or required) the user to select whether the room being vacuumed was small, medium, or large. iRobot ditched that system one generation later, replacing the room size buttons with a single “clean” button. Programmable scheduling meant that users no longer needed to push any buttons at all, and with Roombas able to find their way back to their docking stations, all you needed to do was empty the dustbin. And with the most recent few generations (the S and i series), the dustbin emptying is also done for you, reducing direct interaction with the robot to once a month or less.

Image: iRobot

iRobot CEO Colin Angle believes that working toward more intelligent human-robot collaboration is “the brave new frontier” of AI. “This whole journey has been earning the right to take this next step, because a robot can’t be responsive if it’s incompetent,” he says. “But thinking that autonomy was the destination was where I was just completely wrong.”

The point that the top-end Roombas are at now reflects a goal that iRobot has been working toward since 2002: With autonomy, scheduling, and the clean base to empty the bin, you can set up your Roomba to vacuum when you’re not home, giving you cleaner floors every single day without you even being aware that the Roomba is hard at work while you’re out. It’s not just hands-off, it’s brain-off. No noise, no fuss, just things being cleaner thanks to the efforts of a robot that does its best to be invisible to you. Personally, I’ve been completely sold on this idea for home robots, and iRobot CEO Colin Angle was as well.

“I probably told you that the perfect Roomba is the Roomba that you never see, you never touch, you just come home everyday and it’s done the right thing,” Angle told us. “But customers don’t want that—they want to be able to control what the robot does. We started to hear this a couple years ago, and it took a while before it sunk in, but it made sense.”

How? Angle compares it to having a human come into your house to clean, but not being allowed to tell them where or when to do their job. Maybe after a while, you’d build up the amount of trust necessary for that to work, but in the short term, it would likely be frustrating. And people get frustrated with their Roombas for this reason. “The desire to have more control over what the robot does kept coming up, and for me, it required a pretty big shift in my view of what intelligence we were trying to build. Autonomy is not intelligence. We need to do something more.”

That something more, Angle says, is a partnership as opposed to autonomy. It’s an acknowledgement that not everyone has the same level of trust in robots as the people who build them. It’s an understanding that people want to feel in control of homes they’ve set up the way they want and have been cleaning the way they want, and that a robot shouldn’t just come in and do its own thing.

“Until the robot proves that it knows enough about your home and about the way that you want your home cleaned,” Angle says, “you can’t move forward.” He adds that this is one of those things that seem obvious in retrospect, but even if they’d wanted to address the issue before, they didn’t have the technology to solve the problem. Now they do. “This whole journey has been earning the right to take this next step, because a robot can’t be responsive if it’s incompetent,” Angle says. “But thinking that autonomy was the destination was where I was just completely wrong.”

The previous iteration of the iRobot app (and Roombas themselves) is built around one big fat CLEAN button. The new approach instead tries to figure out in much more detail where the robot should clean, and when, using a mixture of autonomous technology and interaction with the user.

Where to Clean
Knowing where to clean depends on your Roomba having a detailed and accurate map of its environment. For several generations now, Roombas have been using visual simultaneous localization and mapping (VSLAM) to build persistent maps of your home. These maps have been used to tell the Roomba to clean in specific rooms, but that’s about it. With the new update, Roombas with cameras will be able to recognize some objects and features in your home, including chairs, tables, couches, and even countertops. The robots will use these features to identify where messes tend to happen so that they can focus on those areas—like around the dining room table or along the front of the couch.

We should take a minute here to clarify how the Roomba is using its camera. The original (primary?) purpose of the camera was for VSLAM, where the robot would take photos of your home, downsample them into QR-code-like patterns of light and dark, and then use those (with the assistance of other sensors) to navigate. Now the camera is also being used to take pictures of other stuff around your house to make that map more useful.
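As a toy illustration of that downsampling idea (and emphatically not iRobot’s actual pipeline), here’s how a frame might be reduced to a compact light/dark pattern that can be compared cheaply against stored landmarks:

    # Toy version of "downsample frames into QR-code-like patterns."
    # Not iRobot's actual algorithm; purely for illustration.

    import numpy as np

    def signature(gray: np.ndarray, size: int = 16) -> np.ndarray:
        """Downsample a grayscale image to a size x size binary pattern."""
        h, w = gray.shape
        crop = gray[: h - h % size, : w - w % size]
        blocks = crop.reshape(size, crop.shape[0] // size,
                              size, crop.shape[1] // size).mean(axis=(1, 3))
        return (blocks > blocks.mean()).astype(np.uint8)  # light/dark grid

    def distance(a: np.ndarray, b: np.ndarray) -> int:
        """Hamming distance: small means the two views probably match."""
        return int(np.count_nonzero(a != b))

    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    print(distance(signature(frame), signature(frame)))  # 0: same view matches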

Photo: iRobot

The robots will now try to fit into the kinds of cleaning routines that many people already have established. For example, the app may suggest an “after dinner” routine that cleans just around the kitchen and dining room table.

This is done through machine learning using a library of images of common household objects from a floor perspective that iRobot had to develop from scratch. Angle clarified for us that this is all done via a neural net that runs on the robot, and that “no recognizable images are ever stored on the robot or kept, and no images ever leave the robot.” Worst case, if all the data iRobot has about your home gets somehow stolen, the hacker would only know that (for example) your dining room has a table in it and the approximate size and location of that table, because the map iRobot has of your place only stores symbolic representations rather than images.
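Here’s a hypothetical sketch of what such a symbolic map entry might look like. The field names are invented for illustration, but the point stands: nothing image-like is stored, just labels and rough geometry:

    # Hypothetical symbolic map entry: object class plus rough size and
    # position, with no imagery attached. Field names are invented.

    from dataclasses import dataclass

    @dataclass
    class MapObject:
        label: str      # e.g. "dining_table", from the on-robot neural net
        room: str       # which room the object was seen in
        x_m: float      # approximate position in the map frame, in meters
        y_m: float
        width_m: float  # approximate footprint
        depth_m: float

    # A leaked map would reveal only facts like this:
    print(MapObject("dining_table", "dining_room", 3.1, 2.4, 1.6, 0.9))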

Another useful new feature is intended to help manage the “evil Roomba places” (as Angle puts it) that every home has, the spots that cause Roombas to get stuck. If a place is evil enough that Roomba has to call you for help because it gave up completely, Roomba will now remember it, and suggest either that you make some changes or that it stop cleaning there, which seems reasonable.

When to Clean
It turns out that the primary cause of mission failure for Roombas is not that they get stuck or that they run out of battery—it’s user cancellation, usually because the robot is getting in the way or being noisy when you don’t want it to be. “If you kill a Roomba’s job because it annoys you,” points out Angle, “how is that robot being a good partner? I think it’s an epic fail.” Of course, it’s not the robot’s fault, because Roombas only clean when we tell them to, which Angle says is part of the problem. “People actually aren’t very good at making their own schedules—they tend to oversimplify, and not think through what their schedules are actually about, which leads to lots of [figurative] Roomba death.”

To help you figure out when the robot should actually be cleaning, the new app will look for patterns in when you ask the robot to clean, and then recommend a schedule based on those patterns. That might mean the robot cleans different areas at different times every day of the week. The app will also make event-based scheduling recommendations, integrated with other smart home devices. Would you prefer the Roomba to clean every time you leave the house? The app can integrate with your security system (or garage door, or any number of other things) and take care of that for you.
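A toy version of that pattern mining might look like the sketch below. This is not iRobot’s algorithm; the (weekday, hour) bucketing and the threshold are arbitrary choices for illustration:

    # Count when the user manually starts cleaning jobs, then recommend
    # recurring slots. Bucketing and threshold are arbitrary choices.

    from collections import Counter
    from datetime import datetime

    def recommend(starts: list[datetime], min_count: int = 3) -> list[tuple[int, int]]:
        """Return (weekday, hour) slots requested at least min_count times."""
        buckets = Counter((t.weekday(), t.hour) for t in starts)
        return sorted(slot for slot, n in buckets.items() if n >= min_count)

    # Four Monday-evening cleans in a row suggest a Monday 7 p.m. routine:
    history = [datetime(2020, 8, d, 19, 15) for d in (3, 10, 17, 24)]
    print(recommend(history))  # [(0, 19)]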

More generally, Roomba will now try to fit into the kinds of cleaning routines that many people already have established. For example, the app may suggest an “after dinner” routine that cleans just around the kitchen and dining room table. The app will also, to some extent, pay attention to the environment and season. It might suggest increasing your vacuuming frequency if pollen counts are especially high, or if it’s pet shedding season and you have a dog. Unfortunately, Roomba isn’t (yet?) capable of recognizing dogs on its own, so the app has to cheat a little bit by asking you some basic questions.

A Smarter App

The app update, which should be available starting today, is free. The scheduling and recommendations will work on every Roomba model, although for object recognition and anything related to mapping, you’ll need one of the more recent and fancier models with a camera. Future app updates will happen on a more aggressive schedule. Major app releases should happen every six months, with incremental updates happening even more frequently than that.

Angle also told us that, overall, this change in direction represents a substantial shift in resources for iRobot: the company has pivoted two-thirds of its engineering organization to focus on software-based collaborative intelligence rather than hardware. “It’s not like we’re done doing hardware,” Angle assured us. “But we do think about hardware differently. We view our robots as platforms that have longer life cycles, and each platform will be able to support multiple generations of software. We’ve kind of decoupled robot intelligence from hardware, and that’s a change.”

Angle believes that working toward more intelligent collaboration between humans and robots is “the brave new frontier of artificial intelligence. I expect it to be the frontier for a reasonable amount of time to come,” he adds. “We have a lot of work to do to create the type of easy-to-use experience that consumer robots need.”

#437697 These Underwater Drones Use Water ...

Yi Chao likes to describe himself as an “armchair oceanographer” because he got incredibly seasick the one time he spent a week aboard a ship. So it’s maybe not surprising that the former NASA scientist has a vision for promoting remote study of the ocean on a grand scale by enabling underwater drones to recharge on the go using his company’s energy-harvesting technology.

Many of the robotic gliders and floating sensor stations currently monitoring the world’s oceans are effectively treated as disposable devices, because the research community has only so many ships, and only so much funding, to retrieve drones after they’ve accomplished their mission of beaming data back home. That’s not only a waste of money, but may also contribute to a growing assortment of abandoned lithium-ion batteries polluting the ocean with their leaking toxic materials—a decidedly unsustainable approach to studying the secrets of the underwater world.

“Our goal is to deploy our energy harvesting system to use renewable energy to power those robots,” says Chao, president and CEO of the startup Seatrec. “We’re going to save one battery at a time, so hopefully we’re not going to dispose of more toxic batteries in the ocean.”

Chao’s California-based startup claims that its SL1 Thermal Energy Harvesting System can already help save researchers money equivalent to an order of magnitude reduction in the cost of using robotic probes for oceanographic data collection. The startup is working on adapting its system to work with autonomous underwater gliders. And it has partnered with defense giant Northrop Grumman to develop an underwater recharging station for oceangoing drones that incorporates Northrop Grumman’s self-insulating electrical connector capable of operating while the powered electrical contacts are submerged.

Seatrec’s energy-harvesting system works by taking advantage of how certain substances transition from solid-to-liquid phase and liquid-to-gas phase when they heat up. The company’s technology harnesses the pressure changes that result from such phase changes in order to generate electricity.

Image: Seatrec

To make the phase changes happen, Seatrec’s solution taps the temperature differences between warmer water at the ocean surface and colder water at the ocean depths. Even a relatively simple robotic probe can generate additional electricity by changing its buoyancy to either float at the surface or sink down into the colder depths.
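The physics here is thin but workable. Here’s a rough sketch of the numbers; the phase-change material’s latent heat and the conversion efficiency are generic textbook-style assumptions, not Seatrec’s specifications:

    # Rough physics of ocean thermal harvesting. The PCM properties and
    # conversion efficiency are generic assumptions, not Seatrec's numbers.

    T_SURFACE_K = 273.15 + 25.0  # warm surface water, ~25 C (assumed)
    T_DEPTH_K = 273.15 + 4.0     # cold deep water, ~4 C

    # Carnot limit for any heat engine between these two reservoirs:
    print(f"Carnot limit: {1.0 - T_DEPTH_K / T_SURFACE_K:.1%}")  # ~7.0%

    # Electricity per dive cycle from melting a phase-change material:
    PCM_KG = 1.0             # assumed mass of PCM carried by the float
    LATENT_J_PER_KG = 200e3  # paraffin-class latent heat (assumed)
    SYSTEM_EFF = 0.01        # assumed heat-to-electricity conversion

    print(PCM_KG * LATENT_J_PER_KG * SYSTEM_EFF, "J per cycle")  # ~2 kJ

A couple of kilojoules per dive cycle isn’t much, but unlike a battery, the ocean’s thermal gradient never runs out.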

By attaching an external energy-harvesting module, Seatrec has already begun transforming robotic probes into assets that can be recharged and reused more affordably than sending out a ship each time to retrieve the probes. This renewable energy approach could keep such drones going almost indefinitely barring electrical or mechanical failures. “We just attach the backpack to the robots, we give them a cable providing power, and they go into the ocean,” Chao explains.

The early buyers of Seatrec’s products are primarily academic researchers who use underwater drones to collect oceanographic data. But the startup has also attracted military and government interest. It has already received small business innovation research contracts from both the U.S. Office of Naval Research and National Oceanic and Atmospheric Administration (NOAA).

Seatrec has also won two $10,000 prizes under the Powering the Blue Economy: Ocean Observing Prize administered by the U.S. Department of Energy and NOAA. The prizes awarded during the DISCOVER Competition phase back in March 2020 included one prize split with Northrop Grumman for the joint Mission Unlimited UUV Station concept. The startup and defense giant are currently looking for a robotics company to partner with for the DEVELOP Competition phase of the Ocean Observing Prize that will offer a total of $3 million in prizes.

In the long run, Seatrec hopes its energy-harvesting technology can support commercial ventures such as the aquaculture industry that operates vast underwater farms. The technology could also support underwater drones carrying out seabed surveys that pave the way for deep sea mining ventures, although those are not without controversy because of their projected environmental impacts.

Among all the possible applications Chao seems especially enthusiastic about the prospect of Seatrec’s renewable power technology enabling underwater drones and floaters to collect oceanographic data for much longer periods of time. He spent the better part of two decades working at the NASA Jet Propulsion Laboratory in Pasadena, Calif., where he helped develop a satellite designed for monitoring the Earth’s oceans. But he and the JPL engineering team that developed Seatrec’s core technology believe that swarms of underwater drones can provide a continuous monitoring network to truly begin understanding the oceans in depth.

The COVID-19 pandemic has slowed production and delivery of Seatrec’s products somewhat given local shutdowns and supply chain disruptions. Still, the startup has been able to continue operating in part because it’s considered to be a defense contractor that is operating an essential manufacturing facility. Seatrec’s engineers and other staff members are working in shifts to practice social distancing.

“Rather than building one or two for the government, we want to scale up to build thousands, hundreds of thousands, hopefully millions, so we can improve our understanding and provide that data to the community,” Chao says.
