Tag Archives: spot

#439736 Spot’s 3.0 Update Adds Increased ...

While Boston Dynamics' Atlas humanoid spends its time learning how to dance and do parkour, the company's Spot quadruped is quietly getting much better at doing useful, valuable tasks in commercial environments. Solving tasks like dynamic path planning and door manipulation in a way that's robust enough that someone can buy your robot and not regret it is, I would argue, just as difficult as (if not more difficult than) getting a robot to do a backflip.
With a short blog post today, Boston Dynamics is announcing Spot Release 3.0, representing more than a year of software improvements over Release 2.0, which we covered back in May of 2020. The highlights of Release 3.0 include autonomous dynamic replanning, cloud integration, some clever camera tricks, and a new ability to handle push-bar doors. Earlier today, we spoke with Zachary Jackowski, Spot's chief engineer at Boston Dynamics, to learn more about what Spot's been up to.
Here are some highlights from Spot's Release 3.0 software upgrade today, lifted from the blog post, which has the entire list:

Mission planning: Save time by selecting which inspection actions you want Spot to perform, and it will take the shortest path to collect your data.
Dynamic replanning: Don't miss inspections due to changes on site. Spot will replan around blocked paths to make sure you get the data you need.
Repeatable image capture: Capture the same image from the same angle every time with scene-based camera alignment for the Spot CAM+ pan-tilt-zoom (PTZ) camera.
Cloud-compatible: Connect Spot to AWS, Azure, IBM Maximo, and other systems with existing or easy-to-build integrations.
Manipulation: Remotely operate the Spot Arm with ease through rear Spot CAM integration and split-screen view. Arm improvements also include added functionality for push-bar doors, revamped grasping UX, and an updated SDK.
Sounds: Keep trained bystanders aware of Spot with configurable warning sounds.

The focus here is not just making Spot more autonomous, but making Spot more autonomous in some very specific ways that are targeted towards commercial usefulness. It's tempting to look at this stuff and say that it doesn't represent any massive new capabilities. But remember that Spot is a product, and its job is to make money, which is an enormous challenge for any robot, much less a relatively expensive quadruped.
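Picking an order for the selected inspection actions that minimizes travel is a small instance of route optimization. As an illustrative sketch only (hypothetical waypoints, not Boston Dynamics' actual planner or API), a greedy nearest-neighbor ordering looks like this:

```python
import math

def plan_mission(start, inspections):
    """Order inspection waypoints with a nearest-neighbor heuristic.

    start: (x, y) robot position; inspections: dict name -> (x, y).
    Returns a visit order that greedily minimizes travel distance.
    """
    remaining = dict(inspections)
    pos, order = start, []
    while remaining:
        # Always walk to the closest unvisited inspection point next.
        name = min(remaining, key=lambda n: math.dist(pos, remaining[n]))
        order.append(name)
        pos = remaining.pop(name)
    return order

# Hypothetical site: three inspection targets around the robot's dock.
route = plan_mission((0, 0), {
    "gauge_A": (10, 0),
    "panel_B": (10, 10),
    "valve_C": (0, 10),
})
```

A production planner would solve this more carefully (nearest-neighbor can be far from optimal), but it captures the "take the shortest path to collect your data" idea.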

For more details on the new release and a general update about Spot, we spoke with Zachary Jackowski, Spot Chief Engineer at Boston Dynamics.
IEEE Spectrum: So what's new with Spot 3.0, and why is this release important?
Zachary Jackowski: We've been focusing heavily on flexible autonomy that really works for our industrial customers. The thing that may not quite come through in the blog post is how iceberg-y making autonomy work on real customer sites is. Our blog post has some bullet points about “dynamic replanning” in maybe 20 words, but in doing that, we actually reengineered almost our entire autonomy system based on the failure modes of what we were seeing on our customer sites.
The biggest thing that changed is that previously, our robot mission paradigm was a linear mission where you would take the robot around your site and record a path. Obviously, that was a little bit fragile on complex sites—if you're on a construction site and someone puts a pallet in your path, you can't follow that path anymore. So we ended up engineering our autonomy system to do building scale mapping, which is a big part of why we're calling it Spot 3.0. This is state-of-the-art from an academic perspective, except that it's volume shipping in a real product, which to me represents a little bit of our insanity.
And one super cool technical nugget in this release is that we have a powerful pan/tilt/zoom camera on the robot that our customers use to take images of gauges and panels. We've added scene-based alignment and also computer vision model-based alignment so that the robot can capture the images from the same perspective, every time, perfectly framed. In pictures of the robot, you can see that there's this crash cage around the camera, but the image alignment stuff actually does inverse kinematics to command the robot's body to shift a little bit if the cage is blocking anything important in the frame.
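To get a feel for the geometry behind re-framing a shot, here is a rough sketch (hypothetical numbers, not Boston Dynamics' implementation) of how a pixel offset between the current and reference images maps to pan/tilt corrections under a pinhole-camera model:

```python
import math

def ptz_correction(px_offset, py_offset, focal_px):
    """Convert a target's pixel offset (current vs. reference frame)
    into pan/tilt angle corrections, using a pinhole-camera model.

    px_offset/py_offset: target displacement in pixels.
    focal_px: focal length expressed in pixels.
    Returns (pan, tilt) corrections in degrees.
    """
    pan = math.degrees(math.atan2(px_offset, focal_px))
    tilt = math.degrees(math.atan2(py_offset, focal_px))
    return pan, tilt

# Suppose the gauge drifted 120 px right and 40 px down in the frame,
# with an assumed focal length of 1200 px at the current zoom level.
pan, tilt = ptz_correction(120, 40, 1200)
```

The real system layers scene-based and learned alignment on top, and falls back to shifting the robot's body via inverse kinematics when the camera alone can't clear the crash cage from the frame.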
When Spot is dynamically replanning around obstacles, how much flexibility does it have in where it goes?
There are a bunch of tricks to figuring out when to give up on a blocked path, and then it's very simple, run-of-the-mill route planning within an existing map. One of the really big design points of our system, which we spent a lot of time talking about during the design phase, is that people in these high-value facilities really value predictability. So it's not desirable for the robot to start wandering around trying to find its way somewhere.
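The simple route planning within an existing map that Jackowski describes can be illustrated with a standard graph search that skips edges reported blocked. This toy sketch (hypothetical waypoint graph, not Spot's actual system) re-runs Dijkstra's algorithm to find the detour:

```python
import heapq

def shortest_path(graph, start, goal, blocked=frozenset()):
    """Dijkstra over a waypoint graph, skipping edges reported blocked.

    graph: dict node -> {neighbor: edge_cost}.
    blocked: set of (a, b) directed edges the robot found obstructed.
    Returns the node list of the cheapest open route, or None.
    """
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node].items():
            if (node, nxt) not in blocked and nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return None  # no open route; give up rather than wander

# Hypothetical site map: a cheap direct route and a longer detour.
site = {"A": {"B": 1, "C": 4}, "B": {"D": 1}, "C": {"D": 1}, "D": {}}
original = shortest_path(site, "A", "D")
detour = shortest_path(site, "A", "D", blocked={("B", "D")})
```

Because the search only ever returns routes already in the map, the robot's behavior stays predictable: it takes a known detour or stops, rather than exploring.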
Do you think that over time, your customers will begin to trust the robot with more autonomy and less predictability?
I think so, but there's a lot of trust to be built there. Our customers have to see the robot do the job well for a significant amount of time, and that will come.
Can you talk a bit more about trying to do state-of-the-art work on a robot that's being deployed commercially?
I can tell you about how big the gap is. When we talk about features like this, our engineers are like, “oh yeah I could read this paper and pull this algorithm and code something up over a weekend and see it work.” It's easy to get a feature to work once, make a really cool GIF, and post it to the engineering group chat room. But if you take a look at what it takes to actually ship a feature at product-level, we're talking person-years to have it reach the level of quality that someone is accustomed to buying an iPhone and just having it work perfectly all the time. You have to write all the code to product standards, implement all your tests, and get everything right there, and then you also have to visit a lot of customers, because the thing that's different about mobile robotics as a product is that it's all about how the system responds to environments that it hasn't seen before.
The blog post calls Spot 3.0 “A Sensing Solution for the Real World.” What is the real world for Spot at this point, and how will that change going forward?
For Spot, 'real world' means power plants, electrical switch yards, chemical plants, breweries, automotive plants, and other living and breathing industrial facilities that have never considered the fact that a robot might one day be walking around in them. It's indoors, it's outdoors, in the dark and in direct sunlight. When you're talking about the geometric aspect of sites, that complexity we're getting pretty comfortable with.
I think the frontiers of complexity for us are things like, how do you work in a busy place with lots of untrained humans moving through it—that's an area where we're investing a lot, but it's going to be a big hill to climb and it'll take a little while before we're really comfortable in environments like that. Functional safety, certified person detectors, all that good stuff, that's a really juicy unsolved field.
Spot can now open push-bar doors, which seems like an easier problem than doors with handles, which Spot learned to open a while ago. Why'd you start with door handles first?
Push-bar doors are an easier problem! But being engineers, we did the harder problem first, because we wanted to get it done. Continue reading

Posted in Human Robots

#439479 Video Friday: Spot Meets BTS

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RSS 2021 – July 12-16, 2021 – [Online Event]
Humanoids 2020 – July 19-21, 2021 – [Online Event]
RO-MAN 2021 – August 8-12, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27-October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

I will never understand why video editors persist in adding extra noise to footage of actual robots that makes them sound like they are badly designed and/or are broken.

11 million people now think that's what Spot actually sounds like.

[ Hyundai ]

For one brief exciting moment this looks like a Spot with five arms.

[ Boston Dynamics ]

Researchers from Baidu Research and the University of Maryland have developed a robotic excavator system that integrates perception, planning, and control capabilities to enable material loading over a long duration with no human intervention.

[ Baidu ]

The Robotics and Perception Group and the University of Zurich present one of the world’s largest indoor drone-testing arenas. Equipped with a real-time motion-capture system consisting of 36 Vicon cameras, and with a flight space of over 30x30x8 meters (7,000 cubic meters), this large research infrastructure allows us to deploy our most advanced perception, learning, planning, and control algorithms to push vision-based agile drones to speeds over 60 km/h and accelerations over 5g.

[ RPG ]

Jump navigation for Mini Cheetah from UC Berkeley.

[ UC Berkeley ]

NASA’s Perseverance rover captured a historic group selfie with the Ingenuity Mars Helicopter on April 6, 2021. But how was the selfie taken? Vandi Verma, Perseverance’s chief engineer for robotic operations at NASA’s Jet Propulsion Laboratory in Southern California breaks down the process in this video.

[ NASA ]

I am like 95% sure that Heineken's cooler robot is mostly just a cut down Segway Ninebot.

[ Heineken ]

Wing has a new airspace safety and authorization app called OpenSky. It is not good in the same way that all of these airspace safety and authorization apps are not good: they only provide airspace information, and do not provide any guidance on other regulations that may impact your ability to fly a drone, while simultaneously making explicit suggestions about how all you need to fly is a green checkmark in the app, which is a lie.

At least it's free, I guess.

[ OpenSky ]

Interesting approach to conveyors from Berkshire Grey.

Where do I get one of them flower cows?

[ Berkshire Grey ]

The idea behind RoboCup has always been to challenge humans at some point, and one of the first steps towards that is being able to recognize humans and what they're doing on the field.

[ Tech United Eindhoven ]

Sawyer is still very much around, but very much in Germany.

[ Rethink Robotics ]

The VoloDrone, Volocopter's heavy-lift and versatile cargo drone, is fully electric, can transport a 200 kg payload up to 40 km, and has 18 rotors and motors powering the electric vertical take-off and landing (eVTOL) aircraft. This innovative urban air mobility solution for intracity logistics will operate within Volocopter's UAM ecosystem for cities. 

[ Volocopter ]

Our technology can be used for remote maintenance tasks—perfect for when you can’t get on-site either because it’s too far, too dangerous or inaccessible. The system increases your speed of response to faults and failures which saves time, money and reputation. In this clip, our engineer is controlling the robot hands from a distance to plug in and take out a USB from its port.

How much extra for a robotic system that can insert a USB plug the correct way every time?

[ Shadow ]

Takenaka Corporation, one of the five major general contractors in Japan, welds structural columns for skyscrapers. Fraunhofer IPA developed a prototype and software for autonomous robotic welding on construction sites. The robot programming system is based on ROS and handles collision avoidance, laser-scanner-based column localization, and tool-changer handling.

[ Fraunhofer ]

Thanks, Jennifer!

In the near future, mixed traffic consisting of manual and autonomous vehicles (AVs) will be common. Questions surrounding how vulnerable road users such as pedestrians in wheelchairs (PWs) will make crossing decisions in these new situations are underexplored. We conducted a remote co-design study with one of the researchers of this work who has the lived experience as a powered wheelchair user and applied inclusive design practices.

[ Paper ]

The IEEE RAS Women in Engineering (WIE) Committee recently completed a several-year study of gender representation in leading roles at RAS-supported conferences. Individuals who hold these roles select organizing committees, choose speakers, and make final decisions on paper acceptances. The authors lead a discussion about the findings and the story behind the study. In addition to presenting detailed data and releasing anonymized datasets for further study, the authors provide suggestions on changes to help ensure a more diverse and representative robotics community where anyone can thrive.

[ WIE ]

Service robots are entering all kinds of business areas, and the outbreak of COVID-19 speeds up their application. Many studies have shown that robots with matching gender-occupational roles receive larger acceptance. However, this can also enlarge the gender bias in society. In this paper, we identified gender norms embedded in service robots by iteratively coding 67 humanoid robot images collected from the Chinese e-commerce platform Alibaba.

[ Paper ]

Systems with legs and arms are becoming increasingly useful and applicable in real world scenarios. So far, in particular for locomotion, most control approaches have focused on using simplified models for online motion and foothold generation. This approach has its limits when dealing with complex robots that are capable of locomotion and manipulation. In this presentation I will show how we apply MPC for locomotion and manipulation with different variants of our quadrupedal robot ANYmal.

[ CMU ]

Thanks, Fan!

Pieter Abbeel's CVPR 2021 Keynote: Towards a General Solution for Robotics.

[ Pieter Abbeel ]

In this Weekly Robotics Meetup, Achille Verheye explains how, while creating motion planning algorithms, he stumbled upon a very niche class of robots called cuspidal robots, which can change configuration without passing through a singularity.

[ Weekly Robotics ]

Thanks, Mat! Continue reading

Posted in Human Robots


#439247 Drones and Sensors Could Spot Fires ...

The speed at which a wildfire can rip through an area and wreak havoc is nothing short of awe-inspiring and terrifying. Early detection of these events is critical for fire management efforts, whether that involves calling in firefighters or evacuating nearby communities.

Currently, early fire detection in remote areas is typically done by satellite—but this approach can be hindered by cloud cover. What’s more, even the most advanced satellite systems detect fires only once the burning area reaches an average size of 18.4 km2 (7.1 square miles).

To detect wildfires earlier on, some researchers are proposing a novel solution that harnesses a network of Internet of Things (IoT) sensors and a fleet of drones, or unmanned aerial vehicles (UAVs). The researchers tested their approach through simulations, described in a study published May 5 in IEEE Internet of Things Journal, finding that it can detect fires that are just 2.5 km2 (just under one square mile) in size with near perfect accuracy.

Their idea is timely, as climate change is driving an increase in wildfires around many regions of the world, as seen recently in California and Australia.

“In the last few years, the number, frequency, and severity of wildfires have increased dramatically worldwide, significantly impacting countries’ economies, ecosystems, and communities. Wildfire management presents a significant challenge in which early fire detection is key,” emphasizes Osama Bushnaq, a senior researcher at the Autonomous Robotics Research Center of the Technology Innovation Institute in Abu Dhabi, who was involved in the study.

The approach that Bushnaq and his colleagues are proposing involves a network of IoT sensors scattered throughout regions of concern, such as a national park or forests situated near communities. If a fire ignites, IoT devices deployed in the area will detect it and wait until a patrolling UAV is within transmission range to report their measurements. If a UAV receives multiple positive detections by the IoT devices, it will notify the nearby firefighting department that a wildfire has been verified.
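The verification step described above (a UAV only raising an alarm after multiple positive detections from in-range IoT devices) can be sketched as follows; the thresholds and sensor layout here are hypothetical, whereas the paper optimizes such parameters:

```python
import math

def uav_verify(uav_pos, sensors, comm_range, min_detections=3):
    """Collect readings from IoT sensors within the UAV's transmission
    range and report a verified wildfire only after enough independent
    positive detections, suppressing single-sensor false alarms.

    sensors: list of (x, y, detected_fire) tuples.
    """
    positives = sum(
        1 for x, y, hit in sensors
        if hit and math.dist(uav_pos, (x, y)) <= comm_range
    )
    return positives >= min_detections

# Hypothetical patrol snapshot: three nearby sensors report a fire,
# one distant sensor is out of range, one in-range sensor sees nothing.
sensors = [(0, 0, True), (1, 0, True), (0, 1, True),
           (9, 9, True), (2, 2, False)]
alarm = uav_verify((0.5, 0.5), sensors, comm_range=3.0)
```

With three in-range positives, the UAV would notify the fire department; a single positive would be ignored until corroborated on a later pass.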

The researchers evaluated a number of different UAVs and IoT sensors based on cost and features to determine the optimal combinations. Next, they tested their UAV-IoT approach through simulations, in which 420 IoT sensors were deployed and 18 UAVs patrolled per square kilometer of simulated forest. The system could detect fires covering 2.5 km2 with greater than 99 percent accuracy. For smaller fires covering 0.5 km2, the approach yielded 69 percent accuracy.

These results suggest that, if an optimal number of UAVs and IoT devices are present, wildfires can be detected in a much shorter time than with satellite imaging. But Bushnaq acknowledges that this approach has its limitations. “UAV-IoT networks can only cover relatively smaller areas,” he explains. “Therefore, the UAV-IoT network would be particularly suitable for wildfire detection at high-risk regions.”

For these reasons, the researchers propose that the UAV-IoT approach be used alongside satellite imaging, which can cover vast areas but with lower wildfire detection speed and reliability.

Moving forward, the team plans to explore ways of further improving upon this approach, for example by optimizing the trajectory of the UAVs or addressing issues related to the battery life of UAVs.

Bushnaq envisions such UAV-IoT systems having much broader applications, too. “Although the system is designed for wildfire detection, it can be used for monitoring different forest parameters, such as wind speed, moisture content, or temperature estimation,” he says, noting that such a system could also be extended beyond the forest setting, for example by monitoring oil spills in bodies of water. Continue reading

Posted in Human Robots

#439073 There’s a ‘New’ Nirvana Song Out, ...

One of the primary capabilities separating human intelligence from artificial intelligence is our ability to be creative—to use nothing but the world around us, our experiences, and our brains to create art. At present, AI needs to be extensively trained on human-made works of art in order to produce new work, so we’ve still got a leg up. That said, neural networks like OpenAI’s GPT-3 and Russian designer Nikolay Ironov have been able to create content indistinguishable from human-made work.

Now there’s another example of AI artistry that’s hard to tell apart from the real thing, and it’s sure to excite 90s alternative rock fans the world over: a brand-new, never-heard-before Nirvana song. Or, more accurately, a song written by a neural network that was trained on Nirvana’s music.

The song is called “Drowned in the Sun,” and it does have a pretty Nirvana-esque ring to it. The neural network that wrote it is Magenta, which was launched by Google in 2016 with the goal of training machines to create art—or as the tool’s website puts it, exploring the role of machine learning as a tool in the creative process. Magenta was built using TensorFlow, Google’s massive open-source software library focused on deep learning applications.

The song was written as part of an album called Lost Tapes of the 27 Club, a project carried out by a Toronto-based organization called Over the Bridge focused on mental health in the music industry.

Here’s how a computer was able to write a song in the unique style of a deceased musician. Twenty to thirty tracks were fed into Magenta’s neural network in the form of MIDI files. MIDI stands for Musical Instrument Digital Interface; the format encodes the details of a song as data representing musical parameters like pitch and tempo. Components of each song, like vocal melody or rhythm guitar, were fed in one at a time.

The neural network found patterns in these different components, and got enough of a handle on them that when given a few notes to start from, it could use those patterns to predict what would come next; in this case, chords and melodies that sound like they could’ve been written by Kurt Cobain.
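Magenta's models are neural networks, but the underlying idea of learning which notes tend to follow which contexts, then predicting what comes next from a few seed notes, can be illustrated with a far simpler stand-in: a Markov chain over MIDI pitch numbers (the toy melody below is invented for illustration):

```python
from collections import Counter, defaultdict

def train(notes, order=2):
    """Count which note follows each context of `order` previous notes."""
    model = defaultdict(Counter)
    for i in range(len(notes) - order):
        context = tuple(notes[i:i + order])
        model[context][notes[i + order]] += 1
    return model

def continue_melody(model, seed, length, order=2):
    """Greedily extend a seed by picking the most common next note."""
    melody = list(seed)
    for _ in range(length):
        context = tuple(melody[-order:])
        if context not in model:
            break  # never saw this context during training
        melody.append(model[context].most_common(1)[0][0])
    return melody

# Toy "training" melody as MIDI pitch numbers (Magenta ingests real MIDI).
riff = [60, 62, 64, 60, 62, 64, 65, 64, 62, 60]
model = train(riff)
out = continue_melody(model, [60, 62], 4)
```

A neural network generalizes far beyond such literal pattern counts, but the workflow is the same: fit a model to the training melodies, then sample continuations from a seed.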

To be clear, Magenta didn’t spit out a ready-to-go song complete with lyrics. The AI wrote the music, but a different neural network wrote the lyrics (using essentially the same process as Magenta), and the team then sifted through “pages and pages” of output to find lyrics that fit the melodies Magenta created.

Eric Hogan, a singer for a Nirvana tribute band whom the Over the Bridge team hired to sing “Drowned in the Sun,” felt that the lyrics were spot-on. “The song is saying, ‘I’m a weirdo, but I like it,’” he said. “That is total Kurt Cobain right there. The sentiment is exactly what he would have said.”

Cobain isn’t the only musician the Lost Tapes project tried to emulate; songs in the styles of Jimi Hendrix, Jim Morrison, and Amy Winehouse were also included. What all these artists have in common is that they died by suicide at the age of 27.

The project is meant to raise awareness around mental health, particularly among music industry professionals. It’s not hard to think of great artists of all persuasions—musicians, painters, writers, actors—whose lives are cut short due to severe depression and other mental health issues for which it can be hard to get help. These issues are sometimes romanticized, as suffering does tend to create art that’s meaningful, relatable, and timeless. But according to the Lost Tapes website, suicide attempts among music industry workers are more than double that of the general population.

How many more hit songs would these artists have written if they were still alive? We’ll never know, but hopefully Lost Tapes of the 27 Club and projects like it will raise awareness of mental health issues, both in the music industry and in general, and help people in need find the right resources. Because no matter how good computers eventually get at creating music, writing, or other art, as Lost Tapes’ website pointedly says, “Even AI will never replace the real thing.”

Image Credit: Edward Xu on Unsplash Continue reading

Posted in Human Robots