Category Archives: Human Robots

Everything about Humanoid Robots and Androids

#439555 Unitree’s Go1 Robot Dog Looks Pretty ...

In 2017, we first wrote about the Chinese startup Unitree Robotics, which had the goal of “making legged robots as popular and affordable as smartphones and drones.” Relative to the cost of other quadrupedal robots (like Boston Dynamics’ $74,000 Spot), Unitree’s quadrupeds are very affordable, with their A1 costing under $10,000 when it became available in 2020. This hasn’t quite reached the point of consumer electronics that Unitree is aiming for, but they’ve just gotten a lot closer: now available is the Unitree Go1, a totally decent looking small size quadruped that can be yours for an astonishingly low $2700.

Not bad, right? Speedy, good looking gait, robust, and a nifty combination of autonomous human following and obstacle avoidance. As with any product video, it’s important to take everything you see here with a grain of salt, but based on Unitree’s track record we have no particular reason to suspect that there’s much in the way of video trickery going on.

There are three versions of the Go1: the $2700 base model Go1 Air, the $3500 Go1, and the $8500 Go1 Edu. This looks to be the sort of Goldilocks pricing model where most people are likely to spring for the middle version Go1, which includes better sensing and compute as well as 50% more battery life and an extra 1 m/s of speed (up to 3.5 m/s) for a modest premium in cost. The top of the line Edu model offers higher-end computing, 2 kg more payload (up to 5 kg), as well as foot-force sensors, lidar, a hardware extension interface, and API access. More detailed specs are here, although if you’re someone who actually cares about detailed robot specs, what you’ll find on Unitree’s website at the moment will probably be a little bit disappointing.

We’ve reached out to Unitree to ask them about some of the specs that aren’t directly addressed on the website. Battery life is a big question—the video seems to suggest that the Go1 is capable of a three-kilometer, 20-minute jog, and then some grocery shopping and a picnic, all while doing obstacle avoidance and person following and with an occasional payload. If all of that is without any battery swaps, that’s pretty good. We’re also wondering exactly what the “Super Sensory System” is, what kinds of tracking and obstacle avoidance and map making skills the Go1 has, and exactly what capabilities you’ll be required to spring for the fancier (and more expensive) versions of the Go1 to enjoy.

Honestly, though, we’re not sure what Unitree could realistically tell us about the Go1 where we’d be like, “hmm okay maybe this isn’t that great of a deal after all.” Of course the real test will be when some non-Unitree folks get a hold of a Go1 to see what it can actually do (Unitree, please contact me for my mailing address), but even at $3500 for the midrange model, this seems like an impressively cost effective little robot.

Update: we contacted Unitree for more details, and they’ve also updated the Go1 website to include the following:

The battery life of the robot while jogging is about 1 hour
It weighs 12kg
The Super Sensory System includes five wide-angle stereo depth cameras, hypersonic distance sensors, and an integrated processing system
It’s running a 16-core CPU and a 1.5-TFLOP GPU

We also asked Wang Xingxing, Unitree’s CEO, about how they were able to make Go1 so affordable, and here’s what he told us:

Unitree Go1 can be regarded as a product we have achieved after 6-7 years of iteration at the hardware level, aimed at the goals of ultra-low cost, high reliability, and high performance. Our company actually spent more manpower and money on the hardware level, such as machinery, than on software.

Posted in Human Robots

#439551 Video Friday: Drone Refueling

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboCup 2021 – June 22-28, 2021 – [Online Event]
RSS 2021 – July 12-16, 2021 – [Online Event]
Humanoids 2020 – July 19-21, 2021 – [Online Event]
RO-MAN 2021 – August 8-12, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

The MQ-25 T1 test asset has flown into the history books as the first unmanned aircraft to ever refuel another aircraft—piloted or autonomous—during flight.

[ Boeing ]

WomBot is an exploratory robot for monitoring wombat burrows, and the press release for it included this rather disappointing video of WomBot discovering a wombat in its burrow.

Apparently that’s what the butt of a dirt-covered wombat looks like. Here is a much more exciting video of an entirely different wombat burrow exploring robot where you get the wombat payoff that you deserve:

[ Paper ]

During the dark of night, using LiDAR for eyes, Cassie Blue is operating fully autonomously on the University of Michigan Wave Field. The terrain is challenging and was not pre-mapped.

For more on what they've been up to over at the University of Michigan, here’s a talk from them at the ICRA 2021 Workshop on Legged Robots.

[ Michigan Robotics ]

Thanks, Jessy!

The Genesis LiveDrive LDD 1800 Series is a new high-torque direct-drive actuator. No gearbox!

[ Genesis ]

This Counter-Unmanned Air System (C-UAS) from DARPA’s Mobile Force Protection (MFP) program may look like it shot out a net and missed, but it was actually firing a bunch of sticky streamers that tangle up motors and whatnot. Festive and crashy!

[ DARPA ]

Learn about this year’s Kuka Innovation Award from some of the teams and judges, some of whom need a haircut more badly than others.

[ KUKA ]

20th Century Studios and Locksmith Animation’s “Ron’s Gone Wrong” is the story of Barney, a socially awkward middle-schooler and Ron, his new walking, talking, digitally-connected device, which is supposed to be his “Best Friend out of the Box.”

For a robot unboxing, that’s actually pretty good. Like, it arrives with a charged battery!

[ EW ]

The robot will serve you now! And it will do so without making a huge mess, thanks to folks from the University of Naples Federico II in Italy.

[ Paper ]

Thanks, Mario!

Over the past year ABB has committed to supporting diversity and inclusion amongst all of our team members, partners and suppliers. To kick off our celebration of Pride Month, Yumi put on the pride flag to show ABB’s commitment to the LGBTQ+ community.

[ ABB ]

How it’s made: surgical masks.

[ Genik ]

Meet Hera, our very own asteroid detective. Together with two CubeSats—Milani the rock decoder and Juventas the radar visionary—Hera is off on an adventure to explore Didymos, a double asteroid system that is typical of the thousands that pose an impact risk to planet Earth.

[ ESA ]

The goal of the EU-funded project ADIR was to demonstrate the feasibility of a key technology for next generation urban mining. Specifically, the project investigated the automated disassembly of electronic equipment to separate and recover valuable materials.

[ ADIR ]

NASA’s Resilient Autonomy activity is developing autonomous software for potential use in aircraft ranging from general aviation retrofit to future autonomous aircraft. This simulator footage shows iGCAS, or improved GCAS, save a small aircraft from diving into a canyon, into the side of a mountain, or into the ground.

[ NASA ]

Mess with the Cocobo security robot at your peril.

[ Impress ]

I thought the whole point of growing rice in flooded fields was that you avoided weed problems, but I guess there are enough semi-aquatic weeds that it can be handy to have a little robot boat that drives around stirring up mud to suppress weed growth.

[ Robotstart ]

We present experimental work on traversing steep, granular slopes with the dynamically walking quadrupedal robot SpaceBok. We validate static and dynamic locomotion with two different foot types (point foot and passive-adaptive planar foot) on Mars analog slopes of up to 25°(the maximum of the testbed).

[ Paper ]

You'll have to suffer through a little bit of German for this one, but you'll be rewarded with a pretty slick flying wing at the end.

[ BFW ]

Thanks, Fan!

Have you ever wondered whether the individual success metrics prevalent in robotics create perverse incentives that harm the long-term needs of the field? Or if the development of high-stakes autonomous systems warrants taking significant risks with real-world deployment to accelerate progress? Are the standards for experimental validation insufficient to ensure that published robotics methods work in the real world? We have all the answers!

[ Robotics Debates ]

Posted in Human Robots

#439543 How Robots Helped Out After the Surfside ...

Editor's Note: Along with Robin Murphy, the authors of this article include David Merrick, Justin Adams, Jarrett Broder, Austin Bush, Laura Hart, and Rayne Hawkins. This team is with the Florida State University's Disaster Incident Response Team, which was in Surfside for 24 days at the request of Florida US&R Task 1 (Miami Dade Fire Rescue Department).

On June 24, 2021, at 1:25AM, portions of the 12-story Champlain Towers South condominium in Surfside, Florida, collapsed, killing 98 people and injuring 11, making it the third-largest fatal collapse in US history. The life-saving and mitigation Response Phase, the phase where responders from local, state, and federal agencies searched for survivors, spanned June 24 to July 7, 2021. This article summarizes what is known about the use of robots at Champlain Towers South and offers insights into challenges for unmanned systems.

Small unmanned aerial systems (drones) were used immediately upon arrival by the Miami Dade Fire Rescue (MDFR) Department to survey the roughly 2.68-acre affected area. Drones, such as the DJI Mavic Enterprise Dual with a spotlight payload and thermal imaging, flew in the dark to determine the scope of the collapse and search for survivors. Regional and state emergency management drone teams were requested later that day to supplement the effort of flying day and night for tactical life-saving operations and to add flights for strategic operations to support managing the overall response.

View of a Phantom 4 Pro in use for mapping the collapse on July 2, 2021. Two other drones were also in the airspace conducting other missions but not visible. Photo: Robin R. Murphy
The teams brought at least 9 models of rotorcraft drones, including the DJI Mavic 2 Enterprise Dual, DJI Mavic 2 Enterprise Advanced, DJI Mavic 2 Zoom, DJI Mavic Mini, DJI Phantom 4 Pro, DJI Matrice 210, Autel Dragonfish, and Autel EVO II Pro, plus a tethered Fotokite drone. The picture above shows a DJI Phantom 4 Pro in use, with one of the multiple cranes on the site visible. The number of flights for tactical operations was not recorded, but drones were flown for 304 missions for strategic operations alone, making the Surfside collapse the largest and longest recorded use of drones for a disaster, exceeding the records set by Hurricane Harvey (112) and Hurricane Florence (260).

Unmanned ground bomb squad robots were reportedly used on at least two occasions in the standing portion of the structure during the response, once to investigate and document the garage and once on July 9 to hold a repeater for a drone flying in the standing portion of the garage. Note that details about the ground robots are not yet available and there may have been more missions, though not on the order of magnitude of the drone use. Bomb squad robots tend to be too large for use in areas other than the standing portions of the collapse.

We concentrate on the use of the drones for tactical and strategic operations, as the authors were directly involved in those operations, and offer a preliminary analysis of the lessons learned. The full details of the response will not be available for many months due to the nature of an active investigation into the causes of the collapse and due to the privacy of the victims and their families.
Drone Use for Tactical Operations
Tactical operations were carried out primarily by MDFR, with other drone teams supporting when necessary to meet the workload. Drones were first used by the MDFR drone team, which arrived within minutes of the collapse as part of the escalating calls. The drone effort started with night operations for direct life-saving and mitigation activities. Small DJI Mavic 2 Enterprise Dual drones with thermal camera and spotlight payloads were used for general situation awareness, helping responders understand the extent of the collapse beyond what could be seen from the street side. The built-in thermal imager was used but lacked the resolution to show details, as much of the material was at the same temperature and heat emissions were fuzzy. The spotlight with the standard visible-light camera was more effective, though the view was constricted. The drones were also used to look for survivors or trapped victims, help determine safety hazards to responders, and provide task force leaders with overwatch of the responders. During daylight, DJI Mavic 2 Zoom drones were added because of their higher-resolution zoom cameras. When fires started in the rubble, drones streaming video to bucket truck operators were used to help optimize the placement of water. Drones were also used to locate civilians entering the restricted area or flying drones to take pictures.

As the response evolved, the use of drones was expanded to missions where the drones would fly in close proximity to structures and objects, fly indoors, and physically interact with the environment. For example, drones were used to read license plates to help identify residents, search for pets, and document belongings inside parts of the standing structure for families. In a novel use of drones for physical interaction, MDFR squads flew drones to attempt to find and pick up items in the standing portion of the structure with immense value to survivors. Before the demolition of the standing portion of the tower, MDFR used a drone to remove an American flag that had been placed on the structure during the initial search.

Drone Use for Strategic Operations

An orthomosaic of the collapse constructed from imagery collected by a drone on July 1, 2021.
Strategic operations were carried out by the Disaster Incident Research Team (DIRT) from the Florida State University Center for Disaster Risk Policy. The DIRT team is a state of Florida asset and was requested by Florida Task Force 1 when it was activated to assist later on June 24. FSU supported tactical operations but was solely responsible for collecting and processing imagery for use in managing the response. This data was primarily orthomosaic maps (a single high-resolution image of the collapse created by stitching together individual high-resolution images, as in the image above) and digital elevation maps (created using structure from motion, below).

Digital elevation map constructed from imagery collected by a drone on June 27, 2021. Photo: Robin R. Murphy
These maps were collected every two to four hours during daylight, with FSU flying an average of 15.75 missions per day for the first two weeks of the response. The latest orthomosaic maps were downloaded at the start of a shift by the tactical responders for use as base maps on their mobile devices. In addition, a 3D reconstruction of the state of the collapse on July 4 was flown the afternoon before the standing portion was demolished, shown below.

GeoCam 3D reconstruction of the collapse on July 4, 2021. Photo: Robin R. Murphy
The mapping functions are notable because they require specialized software for data collection and post-processing, and the speed of post-processing relied on wireless connectivity. In order to stitch and fuse images without gaps or major misalignments, dedicated software packages are used to generate flight paths, autonomously fly the drone, and trigger image capture with sufficient coverage of the collapse and overlap between images.
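To make the overlap requirement concrete, here is a minimal sketch (with hypothetical camera parameters, not the commercial mission-planning software the teams used) of how a lawnmower-style survey plan derives pass spacing and shutter spacing from the desired side and front overlap:

```python
# Sketch: compute a lawnmower survey pattern from desired image overlap.
# All parameters are hypothetical; real mission planners also handle
# wind, gimbal angle, terrain following, and geofencing.

def ground_footprint(altitude_m, sensor_w_mm, sensor_h_mm, focal_mm):
    """Image footprint on flat ground (width, height) in meters."""
    return (altitude_m * sensor_w_mm / focal_mm,
            altitude_m * sensor_h_mm / focal_mm)

def survey_waypoints(area_w_m, area_h_m, altitude_m,
                     front_overlap=0.8, side_overlap=0.7,
                     sensor_w_mm=13.2, sensor_h_mm=8.8, focal_mm=8.8):
    foot_w, foot_h = ground_footprint(altitude_m, sensor_w_mm, sensor_h_mm, focal_mm)
    line_spacing = foot_w * (1 - side_overlap)    # distance between parallel passes
    photo_spacing = foot_h * (1 - front_overlap)  # distance between shutter triggers
    waypoints = []
    x, direction = 0.0, 1
    while x <= area_w_m:
        ys = [i * photo_spacing for i in range(int(area_h_m / photo_spacing) + 1)]
        for y in (ys if direction > 0 else reversed(ys)):
            waypoints.append((x, y))
        x += line_spacing
        direction *= -1  # boustrophedon: reverse travel on alternate passes
    return waypoints

wps = survey_waypoints(area_w_m=120, area_h_m=90, altitude_m=60)
```

With 70 percent side overlap and 80 percent front overlap at 60 m altitude, this example area yields five parallel passes of eight photo triggers each; the overlap is what lets the stitching software match features between neighboring images.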

Coordination of Drones on Site
The aerial assets were loosely coordinated through social media. All drone teams and Federal Aviation Administration (FAA) officials shared a WhatsApp group chat managed by MDFR. WhatsApp offered ease of use, compatibility with everyone's smartphones and mobile devices, and ease of adding pilots. Ease of adding pilots was important because many were not from MDFR and thus would not be in any personnel-oriented coordination system. The pilots did not have physical meetings or briefings as a whole, though the tactical and strategic operations teams did share a common space (nicknamed “Drone Zone”) while the National Institute of Standards and Technology teams worked from a separate staging location. If a pilot was approved by the MDFR drone captain, who served as the “air boss,” they were invited to the WhatsApp group chat and could then begin flying immediately without physically meeting the other pilots.

The teams flew concurrently and independently, without rigid, pre-specified altitude or area restrictions. A team would post which area of the collapse they were taking off to fly over and at what altitude, and then post when they landed. The easiest solution was for the pilots to be aware of each others' drones and adjust their missions, pause, or temporarily defer flights. If a pilot forgot to post, someone would send a teasing chat eliciting a rapid apology.

Incursions by civilian manned and unmanned aircraft into the restricted airspace did occur. If FAA observers or other pilots saw a drone flying that was not accounted for in the chat (e.g., five drones were visible over the area but only four were posted), or if a drone pilot saw a drone in an unexpected area, they would post a query asking if someone had forgotten to post or update a flight. If the drone remained unaccounted for, the FAA would assume that a civilian drone had violated the temporary flight restrictions and search the surrounding area for the offending pilot.
Preliminary Lessons Learned
While the drone data and performance are still being analyzed, some lessons learned have emerged that may be of value to the robotics, AI, and engineering communities.
Tactical and strategic operations during the response phase favored small, inexpensive, easy-to-carry platforms with cameras supporting coarse structure from motion rather than larger, more expensive lidar systems. The added accuracy of lidar was not needed for those missions, though the greater accuracy and resolution of such systems were valuable for the forensic structural analysis. For tactical and strategic operations, the benefits of lidar were not worth the capital costs and logistical burden. Indeed, general-purpose consumer/prosumer drones that could fly day or night, indoors and outdoors, and for both mapping and first-person-view missions were highly preferred over specialized drones. The reliability of a drone was another major factor in choosing a specific model to field, again favoring consumer/prosumer drones, as they typically have hundreds of thousands of hours more flight time than specialized or novel drones. Tethered drones offer some advantages for overwatch, but many tactical missions require a great deal of mobility, and strategic mapping necessitates flying directly over the entire area being mapped.

While small, inexpensive general-purpose drones offered many advantages, they could be further improved for flying at night and indoors. A wider area of lighting would be helpful. A 360-degree (spherical) field of coverage for obstacle avoidance, for working indoors or at low altitudes in close proximity to irregular work envelopes and near people, especially at night, would also be useful. Systems such as the Flyability ELIOS 2 are designed to fly in narrow and highly cluttered indoor areas, but no models were available for the immediate response. Drone camera systems need to be able to look straight up to inspect the underside of structures or ceilings. Mechanisms for determining the accurate GPS location of a pixel in an image, not just the GPS location of the drone, are becoming increasingly desirable.
Other technologies could be of benefit to the enterprise but face challenges. Computer vision/machine learning (CV/ML) for searching for victims in rubble is often mentioned as a possible goal, but a search for victims who are not on the surface of the collapse is not visually directed. The portions of victims that are not covered by rubble are usually camouflaged with gray dust, so searches tend to favor canines using scent. Another challenge for CV/ML methods is the lack of access to training data. Privacy and ethical concerns pose barriers to the research community gaining access to imagery with victims in the rubble, and simulations may not have sufficient fidelity.
The collapse supplies motivation for how informatics, human-computer interaction, and human-robot interaction research can contribute to the effective use of robots during a disaster, and it illustrates that a response does not follow a strictly centralized, hierarchical command structure and that the agencies and members of the response are not known in advance. Proposed systems must be flexible, robust, and easy to use. Furthermore, it is not clear that responders will accept a totally new software app versus making do with a general-purpose app, such as WhatsApp, that the majority routinely use for other purposes.
However, the biggest lesson learned is that robots are helpful and warrant more investment, particularly as many US states are proposing to terminate purchases of the very drone models that were so effective, over cybersecurity concerns. There remains much work to be done by researchers, manufacturers, and emergency management to make these critical technologies more useful for extreme environments. Our current work is focused on creating open source datasets and documentation and conducting a more thorough analysis to accelerate the process.

Value of Drones
The pervasive use of the drones indicates their implicit value in responding to, and documenting, the disaster. It is difficult to quantify the impact of drones, similar to the difficulty of quantifying the impact of a fire truck on firefighting or the use of mobile devices in general. Simply put, drones would not have been used beyond a few flights if they were not valuable.
The impact of the drones on tactical operations was immediate, as upon arrival MDFR flew drones to assess the extent of the collapse. Lighting on fire trucks primarily illuminated the street side of the standing portion of the building, while the drones, unrestricted by streets or debris, quickly expanded situation awareness of the disaster. The drones were used to optimize placement of water to suppress the fires in the debris. The impact of the use of drones for other tactical activities is harder to quantify, but the frequent flights and pilots remaining on stand-by 24/7 indicate their value.
The impact of the drones on strategic operations was also considerable. The data collected by the drones and then processed into 2D maps and 3D models became a critical part of the US&R operations, as well as one part of the nascent investigation into why the building failed. During initial operations, DIRT provided 2D maps to the US&R teams four times per day. These maps became the base layers for the mobile apps used on the pile to mark the locations of human remains, structural members of the building, personal effects, or other identifiable information. Updated orthophotos were critical to the accuracy of these reports. The apps running on mobile devices suffered from GPS accuracy issues, often with errors as high as ten meters. By having base imagery that was only hours old, mobile app users were able to 'drag the pin' on the mobile app to a more accurate report location on the pile, all by visualizing where they were standing compared to fresh UAS imagery. Without this capability, none of the GPS field data would have been of use to US&R or to investigators looking at why the structural collapse occurred. In addition to serving as a base layer on mobile applications, the updated map imagery was used in all tactical, operational, and strategic dashboards by the individual US&R teams as well as the FEMA US&R Incident Support Team (IST) on site to assist in the management of the incident.
Aside from the 2D maps and orthophotos, 3D models were created from the drone data and used by structural experts to plan operations, including identifying areas with high probabilities of finding survivors or victims. Three-dimensional data created through post-processing also supported the demand for up-to-date volumetric estimates – how much material was being removed from the pile, and how much remained. These metrics provided clear indications of progress throughout the operations.
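As a rough illustration of how such volumetric estimates can be computed (the actual Surfside processing pipeline is not public), differencing two aligned digital elevation models and summing only the cells that dropped gives the volume of material removed. The `volume_removed` function and its grid inputs here are hypothetical:

```python
# Sketch: volume change between two digital elevation models (DEMs).
# Assumes both DEMs are aligned grids with the same cell size; real
# photogrammetry tools also handle georeferencing and noise filtering.

def volume_removed(dem_before, dem_after, cell_size_m):
    """Cubic meters of material removed between two elevation grids."""
    cell_area = cell_size_m * cell_size_m
    total = 0.0
    for row_b, row_a in zip(dem_before, dem_after):
        for z_b, z_a in zip(row_b, row_a):
            drop = z_b - z_a
            if drop > 0:          # count only cells where material was removed
                total += drop * cell_area
    return total

before = [[5.0, 4.0], [3.0, 2.0]]   # elevations in meters, 0.5 m cells
after  = [[4.0, 4.0], [1.0, 2.5]]
vol = volume_removed(before, after, cell_size_m=0.5)
```

Running this on successive maps, as described above, is what turns a sequence of 3D reconstructions into a progress metric.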
Acknowledgments
Portions of this work were supported by NSF grants IIS-1945105 and CMMI-2140451. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
The authors express their sincere condolences to the families of the victims.

Posted in Human Robots

#439541 A tactile sensing mechanism for soft ...

In recent years, numerous roboticists worldwide have been trying to develop robotic systems that can artificially replicate the human sense of touch. In addition, they have been trying to create increasingly realistic and advanced bionic limbs and humanoid robots, using soft materials instead of rigid structures.

Posted in Human Robots

#439537 Tencent’s New Wheeled Robot Flicks Its ...

Ollie (I think its name is Ollie) is “a novel wheel-legged robot” from Tencent Robotics. The word “novel” is used quite appropriately here, since Ollie sports some unusual planar parallel legs atop driven wheels. It’s also got a multifunctional actuated tail that not only enables some impressive acrobatics, but also allows the robot to transition from biped-ish to triped-ish to stand up extra tall and support a coffee-carrying manipulator.

It’s a little disappointing that the tail only appears to be engaged for specific motions—it doesn’t seem like it’s generally part of the robot’s balancing or motion planning, which feels like a missed opportunity. But this robot is relatively new, and its development is progressing rapidly, which we know because an earlier version of the hardware and software was presented at ICRA 2021 a couple weeks back. Although, to be honest with you, there isn’t a lot of info on the new one besides the above video, so we’ll be learning what we can from the ICRA paper.

The paper is mostly about developing a nonlinear balancing controller for the robot, and they’ve done a bang-up job with it, with the robot remaining steady even while executing sequences of dynamic motions. The jumping and one-legged motions are particularly cool to watch. And, well, that’s pretty much it for the ICRA paper, which (unfortunately) barely addresses the tail at all, except to say that currently the control system assumes that the tail is fixed. We’re guessing that this is just a symptom of the ICRA paper submission deadline being back in October, and that a lot of progress has been made since then.
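For readers unfamiliar with what a "balancing controller" for a wheel-legged robot does, here is a toy sketch of the underlying idea: a planar wheeled inverted pendulum stabilized by a PD loop on pitch. The gains and model are hypothetical and far simpler than the nonlinear controller in the Tencent paper:

```python
# Sketch: PD balance control of a planar wheeled inverted pendulum,
# the textbook model underlying wheel-legged balancing. Gains, length,
# and integration scheme are illustrative choices, not from the paper.
import math

def simulate(theta0=0.15, kp=60.0, kd=12.0, dt=0.001, steps=3000,
             g=9.81, length=0.5):
    """Euler-integrate pitch dynamics under a wheel-acceleration PD loop."""
    theta, omega = theta0, 0.0   # pitch angle (rad) and pitch rate (rad/s)
    for _ in range(steps):
        # Command wheel acceleration to drive the base back under the body.
        accel = kp * theta + kd * omega
        # Pendulum-on-cart dynamics: gravity tips it, base acceleration rights it.
        alpha = (g * math.sin(theta) - accel * math.cos(theta)) / length
        omega += alpha * dt
        theta += omega * dt
    return theta

final_pitch = simulate()   # pitch after 3 simulated seconds
```

Starting from a 0.15 rad tilt, the loop drives the pitch back toward zero; the real controller additionally coordinates the legs (and eventually, one hopes, that tail) rather than treating the body as a rigid stick.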

Seeing the arm and sensor package at the end of the video is a nod to some sort of practical application, and I suppose that the robot’s ability to stand up to reach over that counter is some justification for using it for a delivery task. But it seems like it’s got so much more to offer, you know? Many far more boring robot platforms could be delivering coffee, so let’s find something for this robot to do that involves more backflips.

Balance Control of a Novel Wheel-legged Robot: Design and Experiments, by Shuai Wang, Leilei Cui, Jingfan Zhang, Jie Lai, Dongsheng Zhang, Ke Chen, Yu Zheng, Zhengyou Zhang, and Zhong-Ping Jiang from Tencent Robotics X, was presented at ICRA 2021.

Posted in Human Robots