#439483 Zebra Technologies To Acquire Fetch ...
A company called Zebra Technologies announced this morning that it intends to acquire Fetch Robotics for an impressive $305 million.
Fetch is best known for its autonomous mobile robots (AMRs) for warehouses and boasts “the largest portfolio of AMRs in the industry,” and we’re particular fans of its one-armed mobile manipulator for research. Zebra, meanwhile, does stuff with barcodes (get it?), and has been actively investing in robotics companies with a goal of increasing its footprint in the intelligent industrial automation space.
According to the press release, the acquisition “will provide an innovative offering that drives greater efficiencies and higher ROI through better orchestration of technology and people.” We have no idea what that means, but fortunately, we’ve been able to speak with both Fetch and Zebra for details about the deal.
Fetch Robotics’ $305 million purchase price includes $290 million in cash to snap up the 95% of Fetch that Zebra doesn’t already own—Zebra had already invested in Fetch through Zebra Ventures, which also has Locus Robotics and Plus One Robotics in its portfolio. There are still some “customary closing conditions” and regulatory approvals that need to happen, so everything isn’t expected to get wrapped up for another month or so. And when it does, it will in some ways mark the end of a robotics story that we’ve been following for the better part of a decade.
Fetch Robotics was founded in early 2015 by the same team of robot experts who had founded Unbounded Robotics just two years before. The founders all worked at Willow Garage, and Unbounded was a mobile manipulation-focused spin-off of Willow that didn’t pan out for reasons that are still not super clear. But in any case, Fetch was a fresh start that allowed them to fully develop their concept for an intelligent, robust, and efficient autonomous mobile robotic system.
Most of what Fetch Robotics does is warehouse logistics—moving stuff from one place to another so that humans don’t have to. Their autonomous mobile robots work outside of warehouses as well, most recently by providing disinfection services for places like airports. There are plenty of other companies in the larger AMR space, but from what we understand, what Fetch has been doing for the last five years has been consistently state of the art.
This is why Fetch makes sense as an acquisition target, I think: they’ve got exceptional technology in an area (fulfillment, mostly) that has been undergoing a huge amount of growth and where robotics has an enormous opportunity. But what about Zebra Technologies? As far as I can make out, Zebra is one of those companies that you’ve probably never heard of but is actually enormous and everywhere. According to Fortune, as of 2020 they were the 581st biggest company in the world (just behind Levi Strauss) with a market value of $25 billion. While Zebra was founded in 1969, the Zebra-ness didn’t come into play until the early 1980s when they started making barcode printers and scanners. They got into RFID in the early 2000s, and then acquired Motorola’s enterprise unit in 2014, giving Zebra a huge mobile technology portfolio.
To find out where robots fit into all of this, and to learn more about what this means for Fetch, we spoke with Melonee Wise, CEO of Fetch, and Jim Lawton, Vice President and General Manager of Robotics Automation at Zebra.
IEEE Spectrum: Can you tell us about Zebra’s background and interest in robotics?
Jim Lawton: Zebra is a combination of companies that have come together over time. Historically, we were a printing company that made barcode labels, and then we acquired a mobile computing business from Motorola, and today we have a variety of devices that do sensing, analyzing, and acting—we’ve been getting increasingly involved in automation in general.
A lot of our major customers are retailers, warehousing, transportation and logistics, or healthcare, and what we’ve heard a lot lately is that there is an increased pressure towards trying to figure out how to run a supply chain efficiently. Workflows have gotten much more complicated and many of our customers don't feel like they're particularly well equipped to sort through those challenges. They understand that there's an opportunity to do something significant with robots, but what does that look like? What are the right strategies? And they're asking us for help.
There are lots of AMR companies out there doing things that superficially seem similar, but what do you feel is special about Fetch?
Jim Lawton: I was at Universal Robots for a while, and at Rethink Robotics for a number of years, and designing and building robots and bringing them to market is really, really hard. The only way to pull it off is with an amazing team, and Melonee has done an extraordinarily outstanding job, pulling together a world class robotics team.
We had invested in Fetch Robotics a couple of years ago, so we've been working pretty closely together already. We invest in companies in part so that we can educate ourselves, but it's also an opportunity to see whether we’re a good fit with each other. Zebra is a technology and engineering oriented company, and Fetch is as well. With the best team, and the best robots, we just think there’s an outstanding opportunity that we haven’t necessarily found with other AMR companies.
What about for Fetch? Why is Zebra a good fit?
Melonee Wise: Over the last couple of years we have been slowly expanding the devices that we want to connect to, and the software ecosystems that we want to connect to, and Zebra has provided a lot of that synergy. We're constantly asked, can we get a robot to do something if we scan a barcode, or can we press a button on a tablet, and have a robot appear, things like that. Being able to deliver these kinds of end to end, fully encapsulated solutions that go beyond the robots and really solve the problems that customers are looking to solve—Zebra helps us do that.
And there's also an opportunity for us as a robotics startup to partner with a larger company to help us scale much more rapidly. That’s the other thing that’s really exciting for us—Zebra has a very strong business in warehousing and logistics. They’re an industry leader, and I think they can really help us get to the next level as a company.
Does that represent a transition for AMRs from just moving things from one place to another to integrating with all kinds of other warehouse systems?
Melonee Wise: For a decade or more, people have been talking about Industry 4.0 and how it's going to change the world and revolutionize manufacturing, but as a community we’ve struggled to execute on that goal for lots of reasons. We've had what people might call islands of automation: siloed pieces of automation that are doing their thing by themselves. But if they have to talk to each other, that's a bridge too far.
But in many ways automation technology is now getting mature enough through the things that we’ve seen in software for a long time, like APIs, interconnected services, and cloud platforms. Zebra has been working on that independently for a long time as part of their business, and so bringing our two businesses together to build these bridges between islands of automation is why it made sense for us to come together at this point in time.
If you go back far enough, Fetch has its origins in Willow Garage and ROS, and I know that Fetch still makes substantial software contributions back to the ROS community. Is that something you’ll be able to continue?
Melonee Wise: Our participation in the open source community is still very important, and I think it’s going to continue to be important. A lot of robotics is really about getting great talent, and open source is one way that we connect to that talent and participate in the larger ecosystem and draw value from it. There are also lots of great tools out there in the open source community that Fetch uses and contributes to. And I think those types of projects that are not core to our IP but give us value will definitely be things that we continue to participate in.
What will happen to the Fetch mobile manipulator that I know a lot of labs are currently using for research?
Melonee Wise: We're committed to continuing to support our existing customers and I think that there’s still a place for the research product going forward.
What do you think are the biggest challenges for AMRs right now?
Melonee Wise: One thing that I think is happening in the industry is that the safety standards are now coming into play. In December of last year the first official autonomous mobile robot safety standard was released, and not everyone was ready for that, but Fetch has been at the front of this for a long time. It took about four years to develop the AMR safety standard and to reach an understanding of what safe actually means and how you implement those safety measures. It’s common for safety standards to lag behind technology, but customers have been asking more and more, “well how do I know that your robots are safe?” And so I think what we're going to see is that these safety standards are going to have differing effects on different companies, based on how thoughtful they've been about safety through the design and implementation of their technology.
What have you learned, or what has surprised you about your industry now that we’re a year and a half into the pandemic?
Melonee Wise: One of the more interesting things to me was how quickly the resistance to the cloud went away when we had to deploy things remotely during the pandemic. Originally customers weren't that excited about the cloud and wanted to do everything on site, but once the pandemic hit they switched their point of view on the technology pretty quickly, which was nice to see.
Jim Lawton: The amount of interest that we've seen in robots and automation in general has skyrocketed over the last year. In particular we’re hearing from companies that are not well equipped to deal with their automation needs, and the pandemic has just made it so much more clear to them that they have to do something. I think we're going to see a renaissance within some of these spaces because of their investment in robotic technologies.
#439465 Dextrous Robotics Wants To Move Boxes ...
Hype aside, there aren’t necessarily all that many areas where robots have the potential to step into an existing workflow and immediately provide a substantial amount of value. But one of the areas that we have seen several robotics companies jump into recently is box manipulation—specifically, using robots to unload boxes from the back of a truck, ideally significantly faster than a human. This is a good task for robots because it plays to their strengths: you can work in a semi-structured and usually predictable environment, speed, power, and precision are all valued highly, and it’s not a job that humans are particularly interested in or designed for.
One of the more novel approaches to this task comes from Dextrous Robotics, a Memphis, TN-based startup led by Evan Drumwright. Drumwright was a professor at GWU before spending a few years at the Toyota Research Institute and then co-founding Dextrous in 2019 with an ex-student of his, Sam Zapolsky. The approach that they’ve come up with is to do box manipulation without any sort of suction, or really any sort of grippers at all. Instead, they’re using what can best be described as a pair of moving arms, each gripping a robotic chopstick.
We can pick up basically anything using chopsticks. If you're good with chopsticks, you can pick up individual grains of rice, and you can pick up things that are relatively large compared to the scale of the chopsticks. Your imagination is about the limit, so wouldn't it be cool if you had a robot that could manipulate things with chopsticks? —Evan Drumwright
It definitely is cool, but are there practical reasons why using chopsticks for box manipulation is a good idea? Of course there are! The nice thing about chopsticks is that they really can grip almost anything (even if you scale them up), making them especially valuable in constrained spaces where you’ve got large disparities in shapes and sizes and weights. They’re good for manipulation, too, able to nudge and reposition things with precision. And while Dextrous is initially focused on a trailer unloading task, having this extra manipulation capability will allow them to consider more difficult manipulation tasks in the future, like trailer loading, a task that necessarily happens just as often as unloading does but which is significantly more complicated to robot-ize.
Even though there are some clear advantages to Dextrous’ chopstick technique, there are disadvantages as well, and the biggest one is likely that it’s just a lot harder to use a manipulation technique like this. “The downside of the chopsticks approach is, as any human will tell you, you need some sophisticated control software to be able to operate,” Drumwright tells us. “But that’s part of what we bring to the game: not just a clever hardware design, but the software to operate it, too.”
Meanwhile, what we’ve seen so far from other companies in this space is pretty consistent use of suction systems for box handling. If you have a flat, non-permeable surface (as with most boxes), suction can work quickly and reliably and with a minimum of fancy planning. However, suction has limits as a form of manipulation, because it’s inherently so sticky, meaning that it can be difficult and/or time consuming to do anything with precision. Other issues with suction include its sensitivity to temperature and moisture, its propensity to ingest all the dirt it possibly can, and the fact that you need to design the suction array based on the biggest and heaviest things that you anticipate having to deal with. That last thing is a particular problem because if you also want to manipulate smaller objects, you’re left trying to do so with a suction array that’s way bigger than you’d like it to be. This is not to say that suction is inferior in all cases, and Drumwright readily admits that suction will probably prove to be a good option for some specific tasks. But chopstick manipulation, if they can get it to work, will be a lot more versatile.
Dextrous Robotics co-founders Evan Drumwright and Sam Zapolsky.
Photo: Dextrous Robotics
I think there's a reason that nature has given us hands. Nature knows how to design suction devices—bats have it, octopi have it, frogs have it—and yet we have hands. Why? Hands are a superior instrument. And so, that's why we've gone down this road. I personally believe, based on billions of years of evolution, that there's a reason that manipulation is superior and that that technology is going to win out. —Evan Drumwright
Part of Dextrous’ secret sauce is an emphasis on simulation. Hardware is hard, so ideally, you want to make one thing that just works the first time, rather than having to iterate over and over. Getting it perfect on the first try is probably unrealistic, but the better you can simulate things in advance, the closer you can get. “What we’ve been able to do is set up our entire planning perception and control system so that it looks exactly like it does when that code runs on the real robot,” says Drumwright. “When we run something on the simulated robot, it agrees with reality about 95 percent of the time, which is frankly unprecedented.” Using very high fidelity hardware modeling, a real time simulator, and software that can directly transfer between sim and real, Dextrous is able to confidently model how their system performs even on notoriously tricky things to simulate, like contact and stiction. The idea is that the end result will be a system that can be developed faster while performing more complex tasks better than other solutions.
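The "same code runs on the simulated robot and the real robot" idea that Drumwright describes is commonly implemented by hiding the hardware behind a shared interface, so the planning and control stack never knows which one it's driving. Here's a minimal, hypothetical sketch of that pattern (all names and the trivial one-step "physics" are our own illustration, not Dextrous' actual software):

```python
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    """Common interface: control code written against this class
    cannot tell whether it is driving a simulator or real motors."""

    @abstractmethod
    def send_command(self, joint_velocities): ...

    @abstractmethod
    def read_state(self): ...

class SimulatedRobot(RobotInterface):
    """Stand-in for a physics engine: integrates velocities one step."""

    def __init__(self):
        self.state = [0.0, 0.0]  # two joint positions

    def send_command(self, joint_velocities):
        dt = 0.01  # fixed simulation timestep
        self.state = [s + v * dt for s, v in zip(self.state, joint_velocities)]

    def read_state(self):
        return list(self.state)

def control_loop(robot: RobotInterface, target, steps=500):
    """Simple proportional controller; identical code for sim and real."""
    for _ in range(steps):
        state = robot.read_state()
        robot.send_command([2.0 * (t - s) for t, s in zip(target, state)])
    return robot.read_state()

# Swapping in a RealRobot implementation would not change control_loop at all.
final = control_loop(SimulatedRobot(), target=[1.0, -0.5])
```

The point of the abstraction is that any gap between simulated and real behavior shows up as a difference between two implementations of the same narrow interface, which is much easier to measure and close than a gap between two separate codebases.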
We were also wondering why this system uses smooth round chopsticks rather than something a little bit grippier, like chopsticks with a square cross section, and maybe with some higher-friction material on the inside surface. Drumwright explains that the advantage of the current design is that it’s symmetrical around its rotational axis, meaning that you only need five degrees of freedom to fully control it. “What that means practically is that things can get a whole lot simpler—the control algorithms get simpler, the inverse kinematics algorithms get simpler, and importantly the number of motors that we need to drive in the robot goes down.”
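To make the kinematic point concrete: a general rigid body needs six numbers to specify its pose (three for position, three for orientation), but rolling a round chopstick about its own axis changes nothing, so that sixth degree of freedom can be dropped. A small illustrative sketch (our own, hypothetical parameterization, not Dextrous' code):

```python
import math

def chopstick_pose(x, y, z, yaw, pitch):
    """Pose of an axially symmetric chopstick: five parameters.

    Three give the position of a reference point on the stick,
    and two angles (yaw, pitch) give the direction of its axis.
    Roll about the axis is omitted because the stick is round,
    so rotating it about its own axis is unobservable.
    """
    # Direction of the chopstick's axis as a unit vector.
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)
    return (x, y, z), (dx, dy, dz)

position, axis = chopstick_pose(0.5, 0.0, 1.2, yaw=math.pi / 4, pitch=0.1)
# The axis direction is a unit vector for any choice of angles.
norm = math.hypot(*axis)
```

One fewer degree of freedom per stick means one fewer actuated joint per arm, which is where the hardware simplification Drumwright mentions comes from.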
Simulated version of Dextrous Robotics’ hardware.
Screenshot: Dextrous Robotics
Dextrous took seed funding 18 months ago, and since then they’ve been working on both the software and hardware for their system as well as finding the time to score an NSF SBIR phase 1 grant. The above screenshot shows the simulation of the hardware they’re working towards (chopstick manipulators on two towers that can move laterally), while the Franka Panda arms are what they’re using to validate their software in the meantime. New hardware should be done imminently, and over the next year, Dextrous is looking forward to conducting paid pilots with real customers.
#439451 12 Robotics Teams Will Hunt For ...
Last week, DARPA announced the twelve teams who will be competing in the Virtual Track of the DARPA Subterranean Challenge Finals, scheduled to take place in September in Louisville, KY. The robots and the environment may be virtual, but the prize money is very real, with $1.5 million of DARPA cash on the table for the teams who are able to find the most subterranean artifacts in the shortest amount of time.
You can check out the list of Virtual Track competitors here, but we’ll be paying particularly close attention to Team Coordinated Robotics and Team BARCS, who have been trading first and second place back and forth across the three previous competitions. But there are many other strong contenders, and since nearly a year will have passed between the Final and the previous Cave Circuit, there’s been plenty of time for all teams to have developed creative new ideas and improvements.
As a quick reminder, the SubT Final will include elements of tunnels, caves, and the urban underground. As before, teams will be using simulated models of real robots to explore the environment looking for artifacts (like injured survivors, cell phones, backpacks, and even hazardous gas), and they’ll have to manage things like austere navigation, degraded sensing and communication, dynamic obstacles, and rough terrain.
While we’re not sure exactly what the Virtual Track is going to look like, one of the exciting aspects of a virtual competition like this is how DARPA is not constrained by things like available physical space or funding. They could make a virtual course that incorporates the inside of the Egyptian pyramids, the Cheyenne Mountain military complex, and my basement, if they were so inclined. We are expecting a combination of the overall themes of the three previous virtual courses (tunnel, cave, and urban), but connected up somehow, and likely with a few surprises thrown in for good measure.
To some extent, the Virtual Track represents the best case scenario for SubT robots, in the sense that fewer things will just spontaneously go wrong. This is something of a compromise, since things very often spontaneously go wrong when you’re dealing with real robots in the real world. This is not to diminish the challenges of the Virtual Track in the least—even the virtual robots aren’t invincible, and their software will need to keep them from running into simulated walls or falling down simulated stairs. But as far as I know, the virtual robots will not experience damage during transport to the event, electronics shorting, motors burning out, emergency stop buttons being accidentally pressed, and that sort of thing. If anything, this makes the Virtual Track more exciting to watch, because you’re seeing teams of virtual robots on their absolute best behavior challenging each other primarily on the cleverness and efficiency of their programmers.
The other reason that the Virtual Track is more exciting is that unlike the Systems Track, there are no humans in the loop at all. Teams submit their software to DARPA, and then sit back and relax (or not) and watch their robots compete all by themselves in real time. This is a hugely ambitious way to do things, because a single human even a little bit in the loop can provide the kind of critical contextual world knowledge and mission perspective that robots often lack. A human in there somewhere is fine in the near to medium term, but full autonomy is the dream.
As for the Systems Track (which involves real robots on the physical course in Louisville), we’re not yet sure who all of the final competitors will be. The pandemic has made travel complicated, and some international teams aren’t yet sure whether they’ll be able to make it. Either way, we’ll be there at the end of September, when we’ll be able to watch both the Systems and Virtual Track teams compete for the SubT Final championship.
#439437 Google parent launches new ...
Google's parent Alphabet unveiled a new “moonshot” project to develop software for robotics which could be used in a wide range of industries.