Tag Archives: motor

#438553 New Drone Software Handles Motor ...

Good as some drones are becoming at obstacle avoidance, accidents do still happen. And as far as robots go, drones are very much on the fragile side of things. Any sort of significant contact between a drone and almost anything else usually results in a catastrophic, out-of-control spin followed by a death plunge to the ground. Bad times. Bad, expensive times.

A few years ago, we saw some interesting research into software that can keep the most common drone form factor, the quadrotor, aloft and controllable even after the failure of one motor. The big caveat to that software was that it relied on GPS for state estimation, meaning that without a GPS signal, the drone is unable to get the information it needs to keep itself under control. In a paper recently accepted to RA-L, researchers at the University of Zurich report that they have developed a vision-based system that brings state estimation completely on-board. The upshot: potentially any drone with some software and a camera can keep itself safe even under the most challenging conditions.

A few years ago, we wrote about first author Sihao Sun’s work on high-speed controlled flight of a quadrotor with a non-functional motor. But that innovation relied on an external motion capture system. Since then, Sun has moved from TU Delft to Davide Scaramuzza’s lab at UZH, and it looks like he’s been able to combine his work on controlled spinning flight with the Robotics and Perception Group’s expertise in vision. Now, a downward-facing camera is all it takes for a spinning drone to remain stable and controllable:

Remember, this software isn’t just about guarding against motor failure. Drone motors don’t simply up and fail all that often, in either hardware or software. But motors and propellers do represent the most likely point of failure for any drone, because when you run into something, what ultimately causes the crash is usually damage to a motor or a propeller that leads to loss of control.

Earlier solutions relied on GPS because the spinning drone needs some method of state estimation—that is, to be closed-loop controllable, the drone needs a reasonable understanding of where it is and how that position is changing over time. GPS is an easy way to take care of this, but it’s an external system that doesn’t work everywhere. A state estimation system that’s completely internal to the drone itself is far more robust, and Sun got his onboard system to work through visual feature tracking with a downward-facing camera, even as the drone spins at over 20 rad/s.
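
To get a feel for what a downward-facing camera provides, here’s a minimal sketch (ours, not the UZH code) of estimating horizontal velocity from tracked ground features with OpenCV. The real system is a full visual-inertial pipeline that also has to separate out the very large rotational component of the flow caused by the spin, which this toy ignores; the focal length and thresholds below are hypothetical.

import cv2
import numpy as np

FOCAL_PX = 400.0  # hypothetical focal length of the downward camera, in pixels

def ground_velocity(prev_gray, gray, height_m, dt):
    # Track corner features from the previous frame into the current one.
    pts = cv2.goodFeaturesToTrack(prev_gray, 200, 0.01, 10)
    if pts is None:
        return None
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.reshape(-1) == 1
    if good.sum() < 10:
        return None
    # Robust average pixel motion between the two frames.
    flow_px = np.median((nxt - pts).reshape(-1, 2)[good], axis=0)
    # Pinhole model over flat ground: metres moved ~ pixel flow * height / focal length.
    return flow_px * height_m / (FOCAL_PX * dt)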

While the system works well enough with a regular downward-facing camera—something that many consumer drones are equipped with for stabilization purposes—replacing it with an event camera (you remember event cameras, right?) makes the performance even better, especially in low light.

For more details on this, including what you’re supposed to do with a rapidly spinning partially disabled quadrotor (as well as what it’ll take to make this a standard feature on consumer hardware), we spoke with Sihao Sun via email.

IEEE Spectrum: What usually happens when a drone spinning this fast lands? Is there any way to do it safely?

Sihao Sun: Our experience shows that we can safely land the drone while it is spinning. When the range sensor measurements are lower than a threshold (around 10 cm, indicating that the drone is close to the ground), we switch off the rotors. During the landing procedure, despite the fast spinning motion, the thrust direction oscillates around the gravity vector, thus the drone touches the ground with its legs without damaging other components.
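
In control terms, that cutoff is about as simple as it sounds. Here’s a minimal sketch of the logic Sun describes; the 10 cm threshold comes from the interview, but the sensor and rotor interfaces are hypothetical stand-ins, not the team’s code.

CUTOFF_ALTITUDE_M = 0.10  # roughly 10 cm, per the interview

def rotors_should_stay_on(range_m: float, currently_on: bool) -> bool:
    """Kill the remaining rotors once the range sensor says the legs can take the touchdown."""
    if currently_on and range_m < CUTOFF_ALTITUDE_M:
        return False
    return currently_on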

Can your system handle more than one motor failure?

Yes, the system can also handle the failure of two opposing rotors. However, if two adjacent rotors or more than two rotors fail, our method cannot save the quadrotor. Some research has shown that it is possible to control a quadrotor with only one remaining rotor. But the drone requires a very special inertial property, which is hard to satisfy in real applications.

How different is your system's performance from a similar system that relies on GPS, in a favorable environment?

In a favorable environment, our system outperforms those relying on GPS signals because it obtains better position estimates. Since a damaged quadrotor spins fast, the accelerometer readings are largely affected by centrifugal forces. When the GPS signal is lost or degraded, a drone relying on GPS needs to integrate these biased accelerometer measurements for position estimation, leading to large position estimation errors. Feeding these erroneous estimates to the flight controller can easily crash the drone.
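
To put rough numbers on that (ours, not the paper’s): if the accelerometer sits just 5 centimeters from the spin axis and the drone spins at 20 rad/s, it measures a centripetal acceleration of ω²r = 20² × 0.05 = 20 m/s², roughly 2 g of systematic bias. Double-integrating even a small residual of that bias produces meter-scale position errors within a few seconds, which is why some direct position reference, whether GPS or vision, is essential.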

When you say that your solution requires “only onboard sensors and computation,” are those requirements specialized, or would they be generally compatible with the current generation of recreational and commercial quadrotors?

We use an NVIDIA Jetson TX2 to run our solution, which includes two parts: the control algorithm and the vision-based state estimation algorithm. The control algorithm is lightweight; thus, we believe that it is compatible with the current generation of quadrotors. On the other hand, the vision-based state estimation requires relatively more computational resources, which may not be affordable for cheap recreational platforms. But this is not an issue for commercial quadrotors because many of them have more powerful processors than a TX2.

What else can event cameras be used for, in recreational or commercial applications?

Many drone applications can benefit from event cameras, especially those in high-speed or low-light conditions, such as autonomous drone racing, cave exploration, drone delivery during night time, etc. Event cameras also consume very little power, which is a significant advantage for energy-critical missions, such as planetary aerial vehicles for Mars explorations. Regarding space applications, we are currently collaborating with JPL to explore the use of event cameras to address the key limitations of standard cameras for the next Mars helicopter.

[ UZH RPG ]

Posted in Human Robots

#437990 Video Friday: Record-Breaking Drone Show ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online]
RoboSoft 2021 – April 12-16, 2021 – [Online]
Let us know if you have suggestions for next week, and enjoy today's videos.

A new parent STAR robot is presented. The parent robot has a tail on which the child robot can climb. By collaborating together, the two robots can reach locations that neither can reach on its own.

The parent robot can also supply the child robot with energy by recharging its batteries. The parent STAR can dispatch and recuperate the child STAR automatically (when aligned). The robots are fitted with sensors and controllers and have automatic capabilities but make no decisions on their own.

[ Bio-Inspired and Medical Robotics Lab ]

How TRI trains its robots.

[ TRI ]

The only thing more satisfying than one SCARA robot is two SCARA robots working together.

[ Fanuc ]

I'm not sure that this is strictly robotics, but it's so cool that it's worth a watch anyway.

[ Shinoda & Makino Lab ]

Flying insects heavily rely on optical flow for visual navigation and flight control. Roboticists have endowed small flying robots with optical flow control as well, since it requires just a tiny vision sensor. However, when using optical flow, the robots run into two problems that insects appear to have overcome. Firstly, since optical flow only provides mixed information on distances and velocities, using it for control leads to oscillations when getting closer to obstacles. Secondly, since optical flow provides very little information on obstacles in the direction of motion, it is hardest to detect obstacles that the robot is actually going to collide with! We propose a solution to these problems by means of a learning process.

[ Nature ]

A new Guinness World Record was set on Friday in north China for the longest animation performed by 600 unmanned aerial vehicles (UAVs).

[ Xinhua ]

Translucency is prevalent in everyday scenes. As such, perception of transparent objects is essential for robots to perform manipulation. In this work, we propose LIT, a two-stage method for transparent object pose estimation using light-field sensing and photorealistic rendering.

[ University of Michigan ] via [ Fetch Robotics ]

This paper reports the technological progress and performance of team “CERBERUS” after participating in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge.

And here's a video report on the SubT Urban Beta Course performance:

[ CERBERUS ]

Congrats to Energy Robotics on 2 million euros in seed funding!

[ Energy Robotics ]

Thanks Stefan!

In just 2 minutes, watch HEBI Robotics spend 23 minutes assembling a robot arm.

HEBI Robotics is hosting a webinar called 'Redefining the Robotic Arm' next week, which you can check out at the link below.

[ HEBI Robotics ]

Thanks Hardik!

Achieving versatile robot locomotion requires motor skills which can adapt to previously unseen situations. We propose a Multi-Expert Learning Architecture (MELA) that learns to generate adaptive skills from a group of representative expert skills. During training, MELA is first initialised by a distinct set of pre-trained experts, each in a separate deep neural network (DNN). Then by learning the combination of these DNNs using a Gating Neural Network (GNN), MELA can acquire more specialised experts and transitional skills across various locomotion modes.

[ Paper ]

Since the dawn of history, advances in science and technology have pursued “power” and “accuracy.” Initially, “hardness” in machines and materials was sought for reliable operations. In our area of Science of Soft Robots, we have combined emerging academic fields aimed at “softness” to increase the exposure and collaboration of researchers in different fields.

[ Science of Soft Robots ]

A team from the Laboratory of Robotics and IoT for Smart Precision Agriculture and Forestry at INESC TEC – Technology and Science is creating a ROS stack solution using Husky UGV for precision field crop agriculture.

[ Clearpath Robotics ]

Associate Professor Christopher J. Hasson in the Department of Physical Therapy is the director of the Neuromotor Systems Laboratory at Northeastern University. There he is working with a robotic arm to provide enhanced assistance to physical therapy patients, while maintaining the intimate therapist-patient relationship.

[ Northeastern ]

Mobile Robotic telePresence (MRP) systems aim to support enhanced collaboration between remote and local members of a given setting. But MRP systems also put the remote user in positions where they frequently rely on the help of local partners. Getting or ‘recruiting’ such help can be done with various verbal and embodied actions ranging in explicitness. In this paper, we look at how such recruitment occurs in video data drawn from an experiment where pairs of participants (one local, one remote) performed a timed searching task.

[ Microsoft Research ]

A presentation [from Team COSTAR] for the American Geophysical Union annual fall meeting on the application of robotic multi-sensor 3D mapping for scientific exploration of caves. Lidar-based 3D maps are combined with visual/thermal/spectral/gas sensors to provide rich 3D context for scientific measurements.

[ COSTAR ]

Posted in Human Robots

#437912 “Boston Dynamics Will Continue to ...

Last week’s announcement that Hyundai acquired Boston Dynamics from SoftBank left us with a lot of questions. We attempted to answer many of those questions ourselves, which is typically bad practice, but sometimes it’s the only option when news like that breaks.

Fortunately, yesterday we were able to speak with Michael Patrick Perry, vice president of business development at Boston Dynamics, who candidly answered our questions about Boston Dynamics’ new relationship with Hyundai and what the near future has in store.

IEEE Spectrum: Boston Dynamics is worth 1.1 billion dollars! Can you put that valuation into context for us?

Michael Patrick Perry: Since 2018, we’ve shifted to becoming a commercial organization. And that’s included a number of things, like taking our existing technology and bringing it to market for the first time. We’ve gone from zero to 400 Spot robots deployed, building out an ecosystem of software developers, sensor providers, and integrators. With that scale of deployment and looking at the pipeline of opportunities that we have lined up over the next year, I think people have started to believe that this isn’t just a one-off novelty—that there’s actual value that Spot is able to create. Secondly, with some of our efforts in the logistics market, we’re getting really strong signals both with our Pick product and also with some early discussions around Handle’s deployment in warehouses, which we think are going to be transformational for that industry.

So, the thing that’s really exciting is that two years ago, we were talking about this vision, and people said, “Wow, that sounds really cool, let’s see how you do.” And now we have the validation from the market saying both that this is actually useful, and that we’re able to execute. And that’s where I think we’re starting to see belief in the long-term viability of Boston Dynamics, not just as a cutting-edge research shop, but also as a business.

Photo: Boston Dynamics

Boston Dynamics says it has deployed 400 Spot robots, building out an “ecosystem of software developers, sensor providers, and integrators.”

How would you describe Hyundai’s overall vision for the future of robotics, and how do they want Boston Dynamics to fit into that vision?

In the immediate term, Hyundai’s focus is to continue our existing trajectories, with Spot, Handle, and Atlas. They believe in the work that we’ve done so far, and we think that combining with a partner that understands many of the industries we’re targeting, whether it’s manufacturing, construction, or logistics, can help us improve our products. And obviously as we start thinking about producing these robots at scale, Hyundai’s expertise in manufacturing is going to be really helpful for us.

Looking down the line, both Boston Dynamics and Hyundai believe in the value of smart mobility, and they’ve made a number of plays in that space. Whether it’s urban air mobility or autonomous driving, they’ve been really thinking about connecting the digital and the physical world through moving systems, whether that’s a car, a vertical takeoff and landing multi-rotor vehicle, or a robot. We are well positioned to take on the robotics side of that while also connecting to some of these other autonomous services.

Can you tell us anything about the kind of robotics that the Hyundai Motor Group has going on right now?

So they’re working on a lot of really interesting stuff—exactly how that connects, you know, it’s early days, and we don’t have anything explicitly to share. But they’ve got a smart and talented robotics team that’s working in a variety of directions that share overlap with us. Obviously, a lot of things related to autonomous driving share some DNA with the work that we’re doing in autonomy for Spot and Handle, so it’s pretty exciting to see.

What are you most excited about here? How do you think this deal will benefit Boston Dynamics?

I think there are a number of things. One is that they have an expertise in hardware, in a way that’s unique. They understand and appreciate the complexity of creating large complex robotic systems. So I think there’s some shared understanding of what it takes to create a great hardware product. And then also they have the resources to help us actually build those products with them together—they have manufacturing resources and things like that.

“Robotics isn’t a short term game. We’ve scaled pretty rapidly but if you start looking at what the full potential of a company like Boston Dynamics is, it’s going to take years to realize, and I think Hyundai is committed to that long-term vision”

Another thing that’s exciting is that Hyundai has some pretty visionary bets for autonomous driving and unmanned aerial systems, and all of that fits very neatly into the connected vision of robotics that we were talking about before. Robotics isn’t a short term game. We’ve scaled pretty rapidly for a robotics company in terms of the scale of robots we’ve been able to deploy in the field, but if you start looking at what the full potential of a company like Boston Dynamics is, it’s going to take years to realize, and I think Hyundai is committed to that long-term vision.

And when you’ve been talking with Hyundai, what are they most excited about?

I think they’re really excited about our existing products and our technology. Looking at some of the things that Spot, Pick, and Handle are able to do now, there are applications that many of Hyundai’s customers could benefit from in terms of mobility, remote sensing, and material handling. Looking down the line, Hyundai is also very interested in smart city technology, and mobile robotics is going to be a core piece of that.

We tend to focus on Spot and Handle and Atlas in terms of platform capabilities, but can you talk a bit about some of the component-level technology that’s unique to Boston Dynamics, and that could be of interest to Hyundai?

Creating very power-dense actuator design is something that we’ve been successful at for several years, starting back with BigDog and LS3. And Handle has some hydraulic actuators and valves that are pretty unique in terms of their design and capability. Fundamentally, we have a systems engineering approach that brings together both hardware and software internally. You’ll often see different groups that specialize in something, like great mechanical or electrical engineering groups, or great controls teams, but what I think makes Boston Dynamics so special is that we’re able to put everything on the table at once to create a system that’s incredibly capable. And that’s why with something like Spot, we’re able to produce it at scale, while also making it flexible enough for all the different applications that the robot is being used for right now.

It’s hard to talk specifics right now, but there are obviously other disciplines within mechanical engineering or electrical engineering or controls for robots or autonomous systems where some of our technology could be applied.

Photo: Boston Dynamics

Boston Dynamics is in the process of commercializing Handle, iterating on its design and planning to get box-moving robots on-site with customers in the next year or two.

While Boston Dynamics was part of Google, and then SoftBank, it seems like there’s been an effort to maintain independence. Is it going to be different with Hyundai? Will there be more direct integration or collaboration?

Obviously it’s early days, but right now, we have support to continue executing against all the plans that we have. That includes all the commercialization of Spot, as well as things for Atlas, which is really going to be pushing the capability of our team to expand into new areas. That’s going to be our immediate focus, and we don’t see anything that’s going to pull us away from that core focus in the near term.

As it stands right now, Boston Dynamics will continue to be Boston Dynamics under this new ownership.

How much of what you do at Boston Dynamics right now would you characterize as fundamental robotics research, and how much is commercialization? And how do you see that changing over the next couple of years?

We have been expanding our commercial team, but we certainly keep a lot of the core capabilities of fundamental robotics research. Some of it is very visible, like the new behavior development for Atlas where we’re pushing the limits of perception and path planning. But a lot of the stuff that we’re working on is a little bit under the hood, things that are less obvious—terrain handling, intervention handling, how to make safe faults, for example. Initially when Spot started slipping on things, it would flail around trying to get back up. We’ve had to figure out the right balance between the robot struggling to stand, and when it should decide to just lock its limbs and fall over because it’s safer to do that.

I’d say the other big thrust for us is manipulation. Our gripper for Spot is coming out early next year, and that’s going to unlock a new set of capabilities for us. We have years and years of locomotion experience, but the ability to manipulate is a space that’s still relatively new to us. So we’ve been ramping up a lot of work over the last several years trying to get to an early but still valuable iteration of the technology, and we’ll continue pushing on that as we start learning what’s most useful to our customers.

“I’d say the other big thrust for us is manipulation. Our gripper for Spot is coming out early next year, and that’s going to unlock a new set of capabilities for us. We have years and years of locomotion experience, but the ability to manipulate is a space that’s still relatively new to us”

Looking back, Spot as a commercial robot has a history that goes back to robots like LS3 and BigDog, which were very ambitious projects funded by agencies like DARPA without much in the way of commercial expectations. Do you think these very early stage, very expensive, very technical projects are still things that Boston Dynamics can take on?

Yes—I would point to a lot of the things we do with Atlas as an example of that. While we don’t have immediate plans to commercialize Atlas, we can point to technologies that come out of Atlas that have enabled some of our commercial efforts over time. There’s not necessarily a clear roadmap of how every piece of Atlas research is going to feed over into a commercial product; it’s more like, this is a really hard fundamental robotics challenge, so let’s tackle it and learn things that we can then benefit from across the company.

And fundamentally, our team loves doing cool stuff with robots, and you’ll continue seeing that in the months to come.

Photo: Boston Dynamics

Spot’s arm with gripper is coming out early next year, and Boston Dynamics says that’s going to “unlock a new set of capabilities for us.”

What would it take to commercialize Atlas? And are you getting closer with Handle?

We’re in the process of commercializing Handle. We’re at a relatively early stage, but we have a plan to get the first versions for box moving on-site with customers in the next year or two. Last year, we did some on-site deployments as proof-of-concept trials, and using the feedback from that, we did a new design pass on the robot, and we’re looking at increasing our manufacturing capability. That’s all in progress.

For Atlas, it’s like the Formula 1 of robots—you’re not going to take a Formula 1 car and try to make it less capable so that you can drive it on the road. We’re still trying to see what are some applications that would necessitate an energy and computationally intensive humanoid robot as opposed to something that’s more inherently stable. Trying to understand that application space is something that we’re interested in, and then down the line, we could look at creating new morphologies to help address specific applications. In many ways, Handle is the first version of that, where we said, “Atlas is good at moving boxes but it’s very complicated and expensive, so let’s create a simpler and smaller design that can achieve some of the same things.”

The press release mentioned a mobile robot for warehouses that will be introduced next year—is that Handle?

Yes, that’s the work that we’re doing on Handle.

As we start thinking about a whole robotic solution for the warehouse, we have to look beyond a high power, low footprint, dynamic platform like Handle and also consider things that are a little less exciting on video. We need a vision system that can look at a messy stack of boxes and figure out how to pick them up, we need an interface between a robot and an order building system—things where people might question why Boston Dynamics is focusing on them because it doesn’t fit in with our crazy backflipping robots, but it’s really incumbent on us to create that full end-to-end solution.

Are you confident that under Hyundai’s ownership, Boston Dynamics will be able to continue taking the risks required to remain on the cutting edge of robotics?

I think we will continue to push the envelope of what robots are capable of, and I think in the near term, you’ll be able to see that realized in our products and the research that we’re pushing forward with. 2021 is going to be a great year for us.

Posted in Human Robots

#437905 New Deep Learning Method Helps Robots ...

One of the biggest things standing in the way of the robot revolution is robots’ inability to adapt. That may be about to change, though, thanks to a new approach that blends pre-learned skills on the fly to tackle new challenges.

Put a robot in a tightly-controlled environment and it can quickly surpass human performance at complex tasks, from building cars to playing table tennis. But throw these machines a curve ball and they’re in trouble—just check out this compilation of some of the world’s most advanced robots coming unstuck in the face of notoriously challenging obstacles like sand, steps, and doorways.

The reason robots tend to be so fragile is that the algorithms that control them are often manually designed. If they encounter a situation the designer didn’t think of, which is almost inevitable in the chaotic real world, then they simply don’t have the tools to react.

Rapid advances in AI have provided a potential workaround by letting robots learn how to carry out tasks instead of relying on hand-coded instructions. A particularly promising approach is deep reinforcement learning, where the robot interacts with its environment through a process of trial-and-error and is rewarded for carrying out the correct actions. Over many repetitions it can use this feedback to learn how to accomplish the task at hand.
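
As a toy illustration of that trial-and-error loop, here’s a tabular Q-learning agent finding its way down a one-dimensional corridor. It’s not deep learning, and certainly not a legged robot, but the reward-driven feedback is the same idea in miniature; all the names and numbers are ours, not from the paper.

import random

N_STATES, GOAL = 6, 5        # corridor cells 0..5; reaching cell 5 earns a reward
ACTIONS = (-1, +1)           # step left or step right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2

def step(state, action):
    nxt = min(max(state + action, 0), N_STATES - 1)
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

for episode in range(200):
    s, done = 0, False
    while not done:
        # Explore occasionally; otherwise exploit the current value estimates.
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda act: q[(s, act)])
        s2, reward, done = step(s, a)
        # Nudge the estimate toward the reward actually received plus discounted future value.
        q[(s, a)] += alpha * (reward + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

print("learned first move:", max(ACTIONS, key=lambda act: q[(0, act)]))  # prints +1: head toward the goal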

But the approach requires huge amounts of data to solve even simple tasks. And most of the things we would want a robot to do are actually made up of many smaller tasks—for instance, delivering a parcel involves learning how to pick an object up, how to walk, how to navigate, and how to pass an object to someone else, among other things.

Training all these sub-tasks simultaneously is hugely complex and far beyond the capabilities of most current AI systems, so many experiments so far have focused on narrow skills. Some have tried to train AI on multiple skills separately and then use an overarching system to flip between these expert sub-systems, but these approaches still can’t adapt to completely new challenges.

Building off this research, though, scientists have now created a new AI system that can blend together expert sub-systems specialized for a specific task. In a paper in Science Robotics, they explain how this allows a four-legged robot to improvise new skills and adapt to unfamiliar challenges in real time.

The technique, dubbed multi-expert learning architecture (MELA), relies on a two-stage training approach. First the researchers used a computer simulation to train two neural networks to carry out two separate tasks: trotting and recovering from a fall.

They then used the models these two networks learned as seeds for eight other neural networks specialized for more specific motor skills, like rolling over or turning left or right. The eight “expert networks” were trained simultaneously along with a “gating network,” which learns how to combine these experts to solve challenges.

Because the gating network synthesizes the expert networks rather than switching them on sequentially, MELA is able to come up with blends of different experts that allow it to tackle problems none could solve alone.
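
As a rough sketch of that blending step (ours, with made-up dimensions and random, untrained weights), a gating function turns the robot’s state into one weight per expert and fuses the experts’ parameters into a single composite policy rather than picking just one. The real MELA system does this with deep networks trained jointly in simulation, which this toy does not attempt.

import numpy as np

rng = np.random.default_rng(0)
OBS_DIM, ACT_DIM, N_EXPERTS = 12, 4, 8   # hypothetical sizes

# Stand-in "expert" policies: each maps robot state to joint targets.
expert_W = rng.standard_normal((N_EXPERTS, ACT_DIM, OBS_DIM)) * 0.1
# Stand-in "gating" function: maps robot state to one score per expert.
gate_W = rng.standard_normal((N_EXPERTS, OBS_DIM)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def act(obs):
    weights = softmax(gate_W @ obs)                      # how much to trust each expert right now
    blended_W = np.tensordot(weights, expert_W, axes=1)  # fuse expert parameters into one policy
    return np.tanh(blended_W @ obs)                      # composite action, not a hard switch

print(act(rng.standard_normal(OBS_DIM)))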

The authors liken the approach to training people in how to play soccer. You start out by getting them to do drills on individual skills like dribbling, passing, or shooting. Once they’ve mastered those, they can then intelligently combine them to deal with more dynamic situations in a real game.

After training the algorithm in simulation, the researchers uploaded it to a four-legged robot and subjected it to a battery of tests, both indoors and outdoors. The robot was able to adapt quickly to tricky surfaces like gravel or pebbles, and could quickly recover from being repeatedly pushed over before continuing on its way.

There’s still some way to go before the approach could be adapted for real-world commercially useful robots. For a start, MELA currently isn’t able to integrate visual perception or a sense of touch; it simply relies on feedback from the robot’s joints to tell it what’s going on around it. The more tasks you ask the robot to master, the more complex and time-consuming the training will get.

Nonetheless, the new approach points towards a promising way to make multi-skilled robots become more than the sum of their parts. As much fun as it is, it seems like laughing at compilations of clumsy robots may soon be a thing of the past.

Image Credit: Yang et al., Science Robotics

Posted in Human Robots

#437884 Hyundai Buys Boston Dynamics for Nearly ...

This morning just after 3 a.m. ET, Boston Dynamics sent out a media release confirming that Hyundai Motor Group has acquired a controlling interest in the company that values Boston Dynamics at US $1.1 billion:

Under the agreement, Hyundai Motor Group will hold an approximately 80 percent stake in Boston Dynamics and SoftBank, through one of its affiliates, will retain an approximately 20 percent stake in Boston Dynamics after the closing of the transaction.

The release is very long, but does have some interesting bits—we’ll go through them, and talk about what this might mean for both Boston Dynamics and Hyundai.

We’ve asked Boston Dynamics for comment, but they’ve been unusually quiet for the last few days (I wonder why!). So at this point just keep in mind that the only things we know for sure are the ones in the release. If (when?) we hear anything from either Boston Dynamics or Hyundai, we’ll update this post.

The first thing to be clear on is that the acquisition is split between Hyundai Motor Group’s affiliates, including Hyundai Motor, Hyundai Mobis, and Hyundai Glovis. Hyundai Motor makes cars, Hyundai Mobis makes car parts and seems to be doing some autonomous stuff as well, and Hyundai Glovis does logistics. There are many other groups that share the Hyundai name, but they’re separate entities, at least on paper. For example, there’s a Hyundai Robotics, but that’s part of Hyundai Heavy Industries, a different company than Hyundai Motor Group. But for this article, when we say “Hyundai,” we’re talking about Hyundai Motor Group.

What’s in it for Hyundai?
Let’s get into the press release, which is filled with press release-y terms like “synergies” and “working together”—you can view the whole thing here—but still has some parts that convey useful info.

By establishing a leading presence in the field of robotics, the acquisition will mark another major step for Hyundai Motor Group toward its strategic transformation into a Smart Mobility Solution Provider. To propel this transformation, Hyundai Motor Group has invested substantially in development of future technologies, including in fields such as autonomous driving technology, connectivity, eco-friendly vehicles, smart factories, advanced materials, artificial intelligence (AI), and robots.

If Hyundai wants to be a “Smart Mobility Solution Provider” with a focus on vehicles, it really seems like there’s a whole bunch of other ways they could have spent most of a billion dollars that would get them there quicker. Will Boston Dynamics’ expertise help them develop autonomous driving technology? Sure, I guess, but why not just buy an autonomous car startup instead? Boston Dynamics is more about “robots,” which happens to be dead last on the list above.

There was some speculation a couple of weeks ago that Hyundai was going to try and leverage Boston Dynamics to make a real version of this hybrid wheeled/legged concept car, so if that’s what Hyundai means by “Smart Mobility Solution Provider,” then I suppose the Boston Dynamics acquisition makes more sense. Still, I think that’s unlikely, because it’s just a concept car, after all.

In addition to “smart mobility,” which seems like a longer-term goal for Hyundai, the company also mentions other, more immediate benefits from the acquisition:

Advanced robotics offer opportunities for rapid growth with the potential to positively impact society in multiple ways. Boston Dynamics is the established leader in developing agile, mobile robots that have been successfully integrated into various business operations. The deal is also expected to allow Hyundai Motor Group and Boston Dynamics to leverage each other’s respective strengths in manufacturing, logistics, construction and automation.

“Successfully integrated” might be a little optimistic here. They’re talking about Spot, of course, but I think the best you could say at this point is that Spot is in the middle of some promising pilot projects. Whether it’ll be successfully integrated in the sense that it’ll have long-term commercial usefulness and value remains to be seen. I’m optimistic about this as well, but Spot is definitely not there yet.

What does probably hold a lot of value for Hyundai is getting Spot, Pick, and perhaps even Handle into that “manufacturing, logistics, construction” stuff. This is the bread and butter for robots right now, and Boston Dynamics has plenty of valuable technology to offer in those spaces.

Photo: Bob O’Connor

Boston Dynamics is selling Spot for $74,500, shipping included.

Betting on Spot and Pick
With Boston Dynamics founder Marc Raibert’s transition to Chairman of the company, the CEO position is now occupied by Robert Playter, the long-time VP of engineering and more recently COO at Boston Dynamics. Here’s his statement from the release:

“Boston Dynamics’ commercial business has grown rapidly as we’ve brought to market the first robot that can automate repetitive and dangerous tasks in workplaces designed for human-level mobility. We and Hyundai share a view of the transformational power of mobility and look forward to working together to accelerate our plans to enable the world with cutting edge automation, and to continue to solve the world’s hardest robotics challenges for our customers.”

Whether Spot is in fact “the first robot that can automate repetitive and dangerous tasks in workplaces designed for human-level mobility” on the market is perhaps something that could be argued against, although I won’t. Whether or not it was the first robot that can do these kinds of things, it’s definitely not the only robot that does these kinds of things, and going forward, it’s going to be increasingly challenging for Spot to maintain its uniqueness.

For a long time, Boston Dynamics totally owned the quadruped space. Now, they’re one company among many—ANYbotics and Unitree are just two examples of other quadrupeds that are being successfully commercialized. Spot is certainly very capable and easy to use, and we shouldn’t underestimate the effort required to create a robot as complex as Spot that can be commercially used and supported. But it’s not clear how long they’ll maintain that advantage, with much more affordable platforms coming out of Asia, and other companies offering some unique new capabilities.

Photo: Boston Dynamics

Boston Dynamics’ Handle is an all-electric robot featuring a leg-wheel hybrid mobility system, a manipulator arm with a vacuum gripper, and a counterbalancing tail.

Boston Dynamics’ picking system, which stemmed from their 2019 acquisition of Kinema Systems, faces the same kinds of challenges—it’s very good, but it’s not totally unique.

Boston Dynamics produces highly capable mobile robots with advanced mobility, dexterity and intelligence, enabling automation in difficult, dangerous, or unstructured environments. The company launched sales of its first commercial robot, Spot, in June of 2020 and has since sold hundreds of robots in a variety of industries, such as power utilities, construction, manufacturing, oil and gas, and mining. Boston Dynamics plans to expand the Spot product line early next year with an enterprise version of the robot with greater levels of autonomy and remote inspection capabilities, and the release of a robotic arm, which will be a breakthrough in mobile manipulation.

Boston Dynamics is also entering the logistics automation market with the industry leading Pick, a computer vision-based depalletizing solution, and will introduce a mobile robot for warehouses in 2021.

Huh. We’ll be trying to figure out what “greater levels of autonomy” means, as well as whether the “mobile robot for warehouses” is Handle, or something more like an autonomous mobile robot (AMR) platform. I’d honestly be surprised if Handle was ready for work outside of Boston Dynamics next year, and it’s hard to imagine how Boston Dynamics could leverage their expertise into the AMR space with something that wouldn’t just seem… Dull, compared to what they usually do. I hope to be surprised, though!

A new deep-pocketed benefactor

Hyundai Motor Group’s decision to acquire Boston Dynamics is based on its growth potential and wide range of capabilities.

“Wide range of capabilities” we get, but that other phrase, “growth potential,” has a heck of a lot wrapped up in it. At the moment, Boston Dynamics is nowhere near profitable, as far as we know. SoftBank acquired Boston Dynamics in 2017 for between one hundred and two hundred million, and over the last three years they’ve poured hundreds of millions more into Boston Dynamics.

Hyundai’s 80 percent stake just means that they’ll need to take over the majority of that support, and perhaps even increase it if Boston Dynamics’ growth is one of their primary goals. Hyundai can’t have a reasonable expectation that Boston Dynamics will be profitable any time soon; they’re selling Spots now, but it’s an open question whether Spot will manage to find a scalable niche in which it’ll be useful in the sort of volume that will make it a sustainable commercial success. And even if it does become a success, it seems unlikely that Spot by itself will make a significant dent in Boston Dynamics’ burn rate anytime soon. Boston Dynamics will have more products of course, but it’s going to take a while, and Hyundai will need to support them in the interim.

Depending on whether Hyundai views Boston Dynamics as a company that does research or a company that makes robots that are useful and profitable, it may be difficult for Boston Dynamics to justify the cost to develop the next Atlas, when the current one still seems so far from commercialization

It’s become clear that to sustain itself, Boston Dynamics needs a benefactor with very deep pockets and a long time horizon. Initially, Boston Dynamics’ business model (or whatever you want to call it) was to do bespoke projects for defense-ish folks like DARPA, but from what we understand Boston Dynamics stopped that sort of work after Google acquired them back in 2013. From one perspective, that government funding did exactly what it was supposed to do, which was to fund the development of legged robots through low TRLs (technology readiness levels) to the point where they could start to explore commercialization.

The question now, though, is whether Hyundai is willing to let Boston Dynamics undertake the kinds of low-TRL, high-risk projects that led from BigDog to LS3 to Spot, and from PETMAN to DRC Atlas to the current Atlas. So will Hyundai be cool about the whole thing and be the sort of benefactor that’s willing to give Boston Dynamics the resources that they need to keep doing what they’re doing, without having to answer too many awkward questions about things like practicality and profitability? Hyundai can certainly afford to do this, but so could SoftBank, and Google—the question is whether Hyundai will want to, over the length of time that’s required for the development of the kind of ultra-sophisticated robotics hardware that Boston Dynamics specializes in.

To put it another way: Depending on whether Hyundai’s perspective on Boston Dynamics is as a company that does research or a company that makes robots that are useful and profitable, it may be difficult for Boston Dynamics to justify the cost to develop the next Atlas, when the current one still seems so far from commercialization.

Google, SoftBank, now Hyundai

Boston Dynamics possesses multiple key technologies for high-performance robots equipped with perception, navigation, and intelligence.

Hyundai Motor Group’s AI and Human Robot Interaction (HRI) expertise is highly synergistic with Boston Dynamics’s 3D vision, manipulation, and bipedal/quadruped expertise.

As it turns out, Hyundai Motors does have its own robotics lab, called Hyundai Motors Robotics Lab. Their website is not all that great, but here’s a video from last year:

I’m not entirely clear on what Hyundai means when they use the word “synergistic” when they talk about their robotics lab and Boston Dynamics, but it’s a little bit concerning. Usually, when a big company buys a little company that specializes in something that the big company is interested in, the idea is that the little company, to some extent, will be absorbed into the big company to give them some expertise in that area. Historically, however, Boston Dynamics has been highly resistant to this, maintaining its post-acquisition independence and appearing to be very reluctant to do anything besides what it wants to do, at whatever pace it wants to do it, and as independently as possible.

From what we understand, Boston Dynamics didn’t integrate particularly well with Google’s robotics push in 2013, and we haven’t seen much evidence that SoftBank’s experience was much different. The most direct benefit to SoftBank (or at least the most visible one) was the addition of a fleet of Spot robots to the SoftBank Hawks baseball team cheerleading squad, along with a single (that we know about) choreographed gymnastics routine from an Atlas robot that was only shown on video.

And honestly, if you were a big manufacturing company with a bunch of money and you wanted to build up your own robotics program quickly, you’d probably have much better luck picking up some smaller robotics companies who were a bit less individualistic and would probably be more amenable to integration and would cost way less than a billion dollars-ish. And if integration is ultimately Hyundai’s goal, we’ll be very sad, because it’ll likely signal the end of Boston Dynamics doing the unfettered crazy stuff that we’ve grown to love.

Photo: Bob O’Connor

Possibly the most agile humanoid robot ever built, Atlas can run, climb, jump over obstacles, and even get up after a fall.

Boston Dynamics contemplates its future

The release ends by saying that the transaction is “subject to regulatory approvals and other customary closing conditions” and “is expected to close by June of 2021.” Again, you can read the whole thing here.

My initial reaction is that, despite the “synergies” described by Hyundai, it’s certainly not immediately obvious why the company wants to own 80 percent of Boston Dynamics. I’d also like a better understanding of how they arrived at the $1.1 billion valuation. I’m not saying this because I don’t believe in what Boston Dynamics is doing or in the inherent value of the company, because I absolutely do, albeit perhaps in a slightly less tangible sense. But when you start tossing around numbers like these, a big pile of expectations inevitably comes along with them. I hope that Boston Dynamics is unique enough that the kinds of rules that normally apply to robotics companies (or companies in general) can be set aside, at least somewhat, but I also worry that what made Boston Dynamics great was the explicit funding for the kinds of radical ideas that eventually resulted in robots like Atlas and Spot.

Can Hyundai continue giving Boston Dynamics the support and freedom that they need to keep doing the kinds of things that have made them legendary? I certainly hope so.

Posted in Human Robots