
#435621 ANYbotics Introduces Sleek New ANYmal C ...

Quadrupedal robots are making significant advances lately, and just in the past few months we’ve seen Boston Dynamics’ Spot hauling a truck, IIT’s HyQReal pulling a plane, MIT’s Mini Cheetah doing backflips, Unitree Robotics’ Laikago towing a van, and Ghost Robotics’ Vision 60 exploring a mine. Robot makers are betting that their four-legged machines will prove useful in a variety of applications in construction, security, delivery, and even at home.

ANYbotics has been working on such applications for years, testing out their ANYmal robot in places where humans typically don’t want to go (like offshore platforms) as well as places where humans really don’t want to go (like sewers), and they have a better idea than most companies of what it takes to make quadruped robots successful.

This week, ANYbotics is announcing a completely new quadruped platform, ANYmal C, a major upgrade from the really quite research-y ANYmal B. The new quadruped has been optimized for ruggedness and reliability in industrial environments, with a streamlined body painted a color that lets you know it means business.

ANYmal C’s physical specs are pretty impressive for a production quadruped. It can move at 1 meter per second, manage 20-degree slopes and 45-degree stairs, cross 25-centimeter gaps, and squeeze through passages just 60 centimeters wide. It’s packed with cameras and 3D sensors, including a lidar for 3D mapping and simultaneous localization and mapping (SLAM). All these sensors (along with the vast volume of gait research that’s been done with ANYmal) make this one of the most reliably autonomous quadrupeds out there, with real-time motion planning and obstacle avoidance.

Image: ANYbotics

ANYmal can autonomously attach itself to a cone-shaped docking station to recharge.

ANYmal C is also one of the ruggedest legged robots in existence. The 50-kilogram robot is IP67 rated, meaning that it’s completely impervious to dust and can withstand being submerged in a meter of water for an hour. If it’s submerged for longer than that, you’re absolutely doing something wrong. The robot will run for over 2 hours on battery power, and if that’s not enough endurance, don’t worry, because ANYmal can autonomously impale itself on a weird cone-shaped docking station to recharge.

Photo: ANYbotics

ANYmal C’s sensor payload includes cameras and a lidar for 3D mapping and SLAM.

As far as what ANYmal C is designed to actually do, it’s mostly remote inspection tasks where you need to move around through a relatively complex environment, but where for whatever reason you’d be better off not sending a human. ANYmal C has a sensor payload that gives it lots of visual options, like thermal imaging, and with the ability to handle a 10-kilogram payload, the robot can be adapted to many different environments.

Over the next few months, we’re hoping to see more examples of ANYmal C being deployed to do useful stuff in real-world environments, but for now, we do have a bit more detail from ANYbotics CTO Christian Gehring.

IEEE Spectrum: Can you tell us about the development process for ANYmal C?

Christian Gehring: We tested the previous generation of ANYmal (B) in a broad range of environments over the last few years and gained a lot of insights. Based on our learnings, it became clear that we would have to re-design the robot to meet the requirements of industrial customers in terms of safety, quality, reliability, and lifetime. There were different prototype stages both for the new drives and for single robot assemblies. Apart from electrical tests, we thoroughly tested the thermal control and ingress protection of various subsystems like the depth cameras and actuators.

What can ANYmal C do that the previous version of ANYmal can’t?

ANYmal C was redesigned with a focus on performance increase regarding actuation (new drives), computational power (new hexacore Intel i7 PCs), locomotion and navigation skills, and autonomy (new depth cameras). The new robot additionally features a docking system for autonomous recharging and an inspection payload as an option. The design of ANYmal C is far more integrated than its predecessor, which increases both performance and reliability.

How much of ANYmal C’s development and design was driven by your experience with commercial or industry customers?

Tests (such as the offshore installation with TenneT) and discussions with industry customers were important to get the necessary design input in terms of performance, safety, quality, reliability, and lifetime. Most customers ask for very similar inspection tasks that can be performed with our standard inspection payload and the required software packages. Some are looking for a robot that can also solve some simple manipulation tasks like pushing a button. Overall, most use cases customers have in mind are realistic and achievable, but some are really tough for the robot, like climbing 50° stairs in hot environments of 50°C.

Can you describe how much autonomy you expect ANYmal C to have in industrial or commercial operations?

ANYmal C is primarily developed to perform autonomous routine inspections in industrial environments. This autonomy adds particular value at sites that are difficult to access, where sending human operators is extremely costly. The robot can naturally also be operated via a remote control, and we are working on long-distance remote operation as well.

Do you expect that researchers will be interested in ANYmal C? What research applications could it be useful for?

ANYmal C has been designed to also address the needs of the research community. The robot comes with two powerful hexacore Intel i7 computers and can additionally be equipped with an NVIDIA Jetson Xavier graphics card for learning-based applications. Payload interfaces enable users to easily install and test new sensors. By joining our established ANYmal Research community, researchers get access to simulation tools and software APIs, which boosts their research in various areas like control, machine learning, and navigation.

[ ANYmal C ]


#435619 Video Friday: Watch This Robot Dog ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
RoboBusiness 2019 – October 1-3, 2019 – Santa Clara, CA, USA
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

Team PLUTO (University of Pennsylvania, Ghost Robotics, and Exyn Technologies) put together this video giving us a robot’s-eye-view (or whatever they happen to be using for eyes) of the DARPA Subterranean Challenge tunnel circuits.

[ PLUTO ]

Zhifeng Huang has been improving his jet-stepping humanoid robot, which features new hardware and the ability to take larger and more complex steps.

This video reports the latest progress of an ongoing project that uses a ducted-fan propulsion system to improve a humanoid robot’s ability to step over large ditches. The landing point of the robot’s swing foot can be not only in the forward direction but also to the side. While maintaining quasi-static balance, the robot was able to step over a ditch 450 mm wide (up to 97 percent of the robot’s leg length) in 3D stepping.

[ Paper ]

Thanks Zhifeng!

These underactuated hands from Matei Ciocarlie’s lab at Columbia are magically able to reconfigure themselves to grasp different object types with just one or two motors.

[ Paper ] via [ ROAM Lab ]

This is one reason we should pursue not “autonomous cars” but “fully autonomous cars” that never require humans to take over. We can’t be trusted.

During our early days as the Google self-driving car project, we invited some employees to test our vehicles on their commutes and weekend trips. What we were testing at the time was similar to the highway driver-assist features available on cars today, where the car takes over the boring parts of the driving, but if something outside its ability occurs, the driver has to take over immediately.

What we saw was that our testers put too much trust in that technology. They were doing things like texting, applying makeup, and even falling asleep, which made it clear they would not be ready to take over driving if the vehicle asked them to. This is why we believe that nothing short of full autonomy will do.

[ Waymo ]

Buddy is a DIY and fetchingly minimalist social robot (of sorts) that will be coming to Kickstarter this month.

We have created a new Arduino kit. His name is Buddy. He is a DIY social robot meant to serve as a replacement for Jibo, Cozmo, or any of the other bots that are no longer available. Fully 3D printed and supported, he adds much more to our series of Arduino STEM robotics kits.

Buddy is able to look around and map his surroundings and react to changes within them. He can be surprised and he will always have a unique reaction to changes. The kit can be built very easily in less than an hour. It is even robust enough to take the abuse that kids can give it in a classroom.

[ Littlebots ]

The android Mindar, based on the Buddhist deity of mercy, preaches sermons at Kodaiji temple in Kyoto, and its human colleagues predict that with artificial intelligence it could one day acquire unlimited wisdom. Developed at a cost of almost $1 million (¥106 million) in a joint project between the Zen temple and robotics professor Hiroshi Ishiguro, the robot teaches about compassion and the dangers of desire, anger and ego.

[ Japan Times ]

I’m not sure whether it’s the sound or what, but this thing scares me for some reason.

[ BIRL ]

This gripper uses magnets as a sort of adjustable spring for dynamic stiffness control, which seems pretty clever.

[ Buffalo ]

What a package of medicine sees while being flown by drone from a hospital to a remote clinic in the Dominican Republic. The drone flew 11 km horizontally and 800 meters vertically, and I can’t even imagine what it would take to make that drive.

[ WeRobotics ]

My first ride in a fully autonomous car was at Stanford in 2009. I vividly remember getting in the back seat of a descendant of Junior, and watching the steering wheel turn by itself as the car executed a perfect parking maneuver. Ten years later, it’s still fun to watch other people have that experience.

[ Waymo ]

Flirtey, the pioneer of the commercial drone delivery industry, has unveiled the much-anticipated first video of its next-generation delivery drone, the Flirtey Eagle. The aircraft designer and manufacturer also unveiled the Flirtey Portal, a sophisticated takeoff and landing platform that enables scalable store-to-door operations, and an autonomous software platform that enables drones to deliver safely to homes.

[ Flirtey ]

EPFL scientists are developing new approaches for improved control of robotic hands – in particular for amputees – that combine individual finger control and automation for improved grasping and manipulation. This interdisciplinary proof of concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects.

[ EPFL ]

This video is a few years old, but we’ll take any excuse to watch the majestic sage-grouse be majestic in all their majesticness.

[ UC Davis ]

I like the idea of a game of soccer (or football, to you weirdos in the rest of the world) where the ball has a mind of its own.

[ Sphero ]

Looks like the whole delivery glider idea is really taking off! Or, you know, not taking off.

Weird that they didn’t show the landing, because it sure looked like it was going to plow into the side of the hill at full speed.

[ Yates ] via [ sUAS News ]

This video is from a 2018 paper, but it’s not like we ever get tired of seeing quadrupeds do stuff, right?

[ MIT ]

Founder and Head of Product, Ian Bernstein, and Head of Engineering, Morgan Bell, have been involved in the Misty project for years and they have learned a thing or two about building robots. Hear how and why Misty evolved into a robot development platform, learn what some of the earliest prototypes did (and why they didn’t work for what we envision), and take a deep dive into the technology decisions that form the Misty II platform.

[ Misty Robotics ]

Lex Fridman interviews Vijay Kumar on the Artificial Intelligence Podcast.

[ AI Podcast ]

This week’s CMU RI Seminar is from Ross Knepper at Cornell, on Formalizing Teamwork in Human-Robot Interaction.

Robots out in the world today work for people but not with people. Before robots can work closely with ordinary people as part of a human-robot team in a home or office setting, robots need the ability to acquire a new mix of functional and social skills. Working with people requires a shared understanding of the task, capabilities, intentions, and background knowledge. For robots to act jointly as part of a team with people, they must engage in collaborative planning, which involves forming a consensus through an exchange of information about goals, capabilities, and partial plans. Often, much of this information is conveyed through implicit communication. In this talk, I formalize components of teamwork involving collaboration, communication, and representation. I illustrate how these concepts interact in the application of social navigation, which I argue is a first-class example of teamwork. In this setting, participants must avoid collision by legibly conveying intended passing sides via nonverbal cues like path shape. A topological representation using the braid groups enables the robot to reason about a small enumerable set of passing outcomes. I show how implicit communication of topological group plans achieves rapid convergence to a group consensus, and how a robot in the group can deliberately influence the ultimate outcome to maximize joint performance, yielding pedestrian comfort with the robot.

[ CMU RI ]

In this week’s episode of Robots in Depth, Per speaks with Julien Bourgeois about Claytronics, a project from Carnegie Mellon and Intel to develop “programmable matter.”

Julien started out as a computer scientist. He had always been privately interested in robotics, and he got the opportunity to work on micro robots when his lab was merged into the FEMTO-ST Institute. He later worked with Seth Copen Goldstein at Carnegie Mellon on the Claytronics project.

Julien shows an enlarged mock-up of the small robots that make up programmable matter, catoms, and speaks about how they are designed. Currently he is working on a unit that is one centimeter in diameter and he shows us the very small CPU that goes into that model.

[ Robots in Depth ]


#435616 Video Friday: AlienGo Quadruped Robot ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

CLAWAR 2019 – August 26-28, 2019 – Kuala Lumpur, Malaysia
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

I know you’ve all been closely following our DARPA Subterranean Challenge coverage here and on Twitter, but here are short recap videos of each day just in case you missed something.

[ DARPA SubT ]

After Laikago, Unitree Robotics is now introducing AlienGo, which is looking mighty spry:

We’ve seen MIT’s Mini Cheetah doing backflips earlier this year, but apparently AlienGo is now the largest and heaviest quadruped to perform the maneuver.

[ Unitree ]

The majority of soft robots today rely on external power and control, keeping them tethered to off-board systems or rigged with hard components. Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Caltech have developed soft robotic systems, inspired by origami, that can move and change shape in response to external stimuli, paving the way for fully untethered soft robots.

The Rollbot begins as a flat sheet, about 8 centimeters long and 4 centimeters wide. When placed on a hot surface, about 200°C, one set of hinges folds and the robot curls into a pentagonal wheel.

Another set of hinges is embedded on each of the five sides of the wheel. A hinge folds when in contact with the hot surface, propelling the wheel to turn to the next side, where the next hinge folds. As they roll off the hot surface, the hinges unfold and are ready for the next cycle.

[ Harvard SEAS ]

A new research effort at Caltech aims to help people walk again by combining exoskeletons with spinal stimulation. This initiative, dubbed RoAM (Robotic Assisted Mobility), combines the research of two Caltech roboticists: Aaron Ames, who creates the algorithms that enable walking by bipedal robots and translates these to govern the motion of exoskeletons and prostheses; and Joel Burdick, whose transcutaneous spinal implants have already helped paraplegics in clinical trials to recover some leg function and, crucially, torso control.

[ Caltech ]

Once ExoMars lands, it’s going to have to get itself off of the descent stage and onto the surface, which could be tricky. But practice makes perfect, or as near as you can get on Earth.

That wheel walking technique is pretty cool, and it looks like ExoMars will be able to handle terrain that would scare NASA’s Mars rovers away.

[ ExoMars ]

I am honestly not sure whether this would make the game of golf more or less fun to watch:

[ Nissan ]

Finally, a really exciting use case for Misty!

It can pick up those balls too, right?

[ Misty ]

You know you’re an actual robot if this video doesn’t make you crave Peeps.

[ Soft Robotics ]

COMANOID investigates the deployment of robotic solutions in well-identified Airbus airliner assembly operations that are tedious for human workers and for which access is impossible for wheeled or rail-ported robotic platforms. This video presents a demonstration of autonomous placement of a part inside the aircraft fuselage. The task is performed by TORO, the torque-controlled humanoid robot developed at DLR.

[ COMANOID ]

It’s a little hard to see in this video, but this is a cable-suspended robot arm that has little tiny robot arms that it waves around to help damp down vibrations.

[ CoGiRo ]

This week in Robots in Depth, Per speaks with author Cristina Andersson.

In 2013 she organized events in Finland during European Robotics Week and found that many people were very interested, but that there was also a big lack of knowledge.

She also talks about introducing robotics in society in a way that makes it easy for everyone to understand the benefits, as this will make the process much easier. When people see the clear benefits in one field or situation, they will be much more interested in bringing robotics into their private or professional lives.

[ Robots in Depth ]


#435605 All of the Winners in the DARPA ...

The first competitive event in the DARPA Subterranean Challenge concluded last week—hopefully you were able to follow along on the livestream, on Twitter, or with some of the articles that we’ve posted about the event. We’ll have plenty more to say about how things went for the SubT teams, but while they take a bit of a (well earned) rest, we can take a look at the winning teams as well as who won DARPA’s special superlative awards for the competition.

First Place: Team Explorer (25/40 artifacts found)
With their rugged, reliable robots featuring giant wheels and the ability to drop communications nodes, Team Explorer was in the lead from day 1, scoring in double digits on every single run.

Second Place: Team CoSTAR (11/40 artifacts found)
Team CoSTAR had one of the more diverse lineups of robots, and they switched up which robots they decided to send into the mine as they learned more about the course.

Third Place: Team CTU-CRAS (10/40 artifacts found)
While many teams came to SubT with DARPA funding, Team CTU-CRAS was self-funded, making them eligible for a special $200,000 Tunnel Circuit prize.

DARPA also awarded a bunch of “superlative awards” after SubT:

Most Accurate Artifact: Team Explorer

To score a point, teams had to submit the location of an artifact that was correct to within 5 meters of the artifact itself. However, DARPA was tracking the artifact locations with much higher precision—for example, the “zero” point on the backpack artifact was the center of the label on the front, which DARPA tracked to the millimeter. Team Explorer managed to return the location of a backpack with an error of just 0.18 meter, which is kind of amazing.
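For a sense of how that scoring rule works in practice, here’s a minimal sketch. The 5-meter tolerance is DARPA’s rule as described above; the helper function and the coordinates are hypothetical, purely for illustration.

```python
import math

def artifact_score(reported, ground_truth, tolerance=5.0):
    """Return (scored, error): whether a reported artifact location
    earns a point, and the 3D position error in meters."""
    error = math.dist(reported, ground_truth)  # straight-line (Euclidean) distance
    return error <= tolerance, error

# Hypothetical coordinates chosen so the error comes out near 0.18 m
scored, error = artifact_score((12.30, -4.05, 1.10), (12.42, -3.92, 1.15))
print(scored, round(error, 2))  # True 0.18 -- comfortably inside the 5 m tolerance
```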

Down to the Wire: Team CSIRO Data61

With just an hour to find as many artifacts as possible, teams had to find the right balance between sending robots off to explore and bringing them back into communication range to download artifact locations. Team CSIRO Data61 cut it especially fine, sliding their final point in with a mere 22 seconds to spare.

Most Distinctive Robots: Team Robotika

Team Robotika had some of the quirkiest and most recognizable robots, which DARPA recognized with the “Most Distinctive” award. Robotika told us that part of the reason for that distinctiveness was practical—having a robot that was effectively in two parts meant that they could disassemble it so that it would fit in the baggage compartment of an airplane, very important for a team based in the Czech Republic.

Most Robots Per Person: Team Coordinated Robotics

Kevin Knoedler, who won NASA’s Space Robotics Challenge entirely by himself, brought his own personal swarm of drones to SubT. With a ratio of seven robots to one human, Kevin was almost certainly the hardest working single human at the challenge.

Fan Favorite: Team NCTU

Photo: Evan Ackerman/IEEE Spectrum

The Fan Favorite award went to the team that was most popular on Twitter (with the #SubTChallenge hashtag), and it may or may not be the case that I personally tweeted enough about Team NCTU’s blimp to win them this award. It’s also true that whenever we asked anyone on other teams what their favorite robot was (besides their own, of course), the blimp was overwhelmingly popular. So either way, the award is well deserved.

DARPA shared this little behind-the-scenes clip of the blimp in action (sort of), showing what happened to the poor thing when the mine ventilation system was turned on between runs and DARPA staff had to chase it down and rescue it:

The thing to keep in mind about the results of the Tunnel Circuit is that unlike past DARPA robotics challenges (like the DRC), they don’t necessarily indicate how things are going to go for the Urban or Cave circuits because of how different things are going to be. Explorer did a great job with a team of rugged wheeled vehicles, which turned out to be ideal for navigating through mines, but they’re likely going to need to change things up substantially for the rest of the challenges, where the terrain will be much more complex.

DARPA hasn’t provided any details on the location of the Urban Circuit yet; all we know is that it’ll be sometime in February 2020. This gives teams just six months to take all the lessons that they learned from the Tunnel Circuit and update their hardware, software, and strategies. What were those lessons, and what do teams plan to do differently next year? Check back next week, and we’ll tell you.

[ DARPA SubT ]


#435597 Water Jet Powered Drone Takes Off With ...

At ICRA 2015, the Aerial Robotics Lab at Imperial College London presented a concept for a multimodal flying swimming robot called AquaMAV. The really difficult thing about a flying and swimming robot isn’t so much the transition from the first to the second, since you can manage that even if your robot is completely dead (thanks to gravity), but rather the other way: going from water to air, ideally in a stable and repeatable way. The AquaMAV concept solved this by basically just applying as much concentrated power as possible to the problem, using a jet thruster to hurl the robot out of the water with quite a bit of velocity to spare.

In a paper appearing in Science Robotics this week, the roboticists behind AquaMAV present a fully operational robot that uses a solid-fuel powered chemical reaction to generate an explosion that powers the robot into the air.

The 2015 version of AquaMAV, which was mostly just some very vintage-looking computer renderings and a little bit of hardware, used a small cylinder of CO2 to power its water jet thruster. This worked pretty well, but the mass and complexity of the storage and release mechanism for the compressed gas weren’t all that practical for a flying robot designed for long-term autonomy. It’s a familiar challenge, especially for pneumatically powered soft robots—how do you efficiently generate gas on demand, especially if you need a lot of pressure all at once?

An explosion propels the drone out of the water
There’s one obvious way of generating large amounts of pressurized gas all at once, and that’s explosions. We’ve seen robots use explosive thrust for mobility before, at a variety of scales, and it’s very effective as long as you can both properly harness the explosion and generate the fuel with a minimum of fuss, and this latest version of AquaMAV manages to do both:

The water jet coming out the back of this robot aircraft is being propelled by a gas explosion. The gas comes from the reaction between a little bit of calcium carbide powder stored inside the robot, and water. Water is mixed with the powder one drop at a time, producing acetylene gas, which gets piped into a combustion chamber along with air and water. When ignited, the acetylene-air mixture explodes, forcing the water out of the combustion chamber and providing up to 51 N of thrust, which is enough to launch the 160-gram robot 26 meters up and over the water at 11 m/s. It takes just 50 mg of calcium carbide (mixed with 3 drops of water) to generate enough acetylene for each explosion, and both air and water are of course readily available. With 0.2 g of calcium carbide powder on board, the robot has enough fuel for multiple jumps, and the jump is powerful enough that the robot can get airborne even under fairly aggressive sea conditions.
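If you want a rough sense of those numbers, here’s a back-of-envelope sketch using only the figures quoted above plus standard constants. The reaction CaC2 + 2 H2O → C2H2 + Ca(OH)2 is textbook chemistry; the launch model (a constant 51 N acting on the 160-gram robot until it reaches 11 m/s) is our simplifying assumption, not the authors’ model.

```python
# Back-of-envelope check of the numbers quoted above.
M_CAC2 = 64.1                        # g/mol, molar mass of calcium carbide
R, T, P = 8.314, 298.0, 101_325.0    # gas constant, ~25 C, 1 atm

fuel_g = 0.050                       # 50 mg of CaC2 per shot
mol_c2h2 = fuel_g / M_CAC2           # 1 mol of acetylene per mol of CaC2
vol_ml = mol_c2h2 * R * T / P * 1e6  # ideal-gas volume, in milliliters
print(f"acetylene per shot: ~{vol_ml:.0f} mL")   # roughly 19 mL of gas

thrust, mass, g = 51.0, 0.160, 9.81  # N, kg, m/s^2
accel = thrust / mass - g            # net upward acceleration, ~309 m/s^2
t_burn = 11.0 / accel                # time to reach the quoted 11 m/s
print(f"net accel ~{accel:.0f} m/s^2, thrust phase ~{t_burn * 1000:.0f} ms")
```

A single 50-milligram charge works out to roughly 19 milliliters of acetylene, and under this simple model the thrust phase lasts only a few tens of milliseconds, which squares with the impulsive, explosion-driven launch described above.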

Image: Science Robotics

The robot can transition from a floating state to an airborne jetting phase and back to floating (A). A 3D model render of the underside of the robot (B) shows the electronics capsule. The capsule contains the fuel tank (C), where calcium carbide reacts with air and water to propel the vehicle.

Next step: getting the robot to fly autonomously
Providing adequate thrust is just one problem that needs to be solved when attempting to conquer the water-air transition with a fixed-wing robot. The overall design of the robot itself is a challenge as well, because the optimal design and balance for the robot is quite different in each phase of operation, as the paper describes:

For the vehicle to fly in a stable manner during the jetting phase, the center of mass must be a significant distance in front of the center of pressure of the vehicle. However, to maintain a stable floating position on the water surface and the desired angle during jetting, the center of mass must be located behind the center of buoyancy. For the gliding phase, a fine balance between the center of mass and the center of pressure must be struck to achieve static longitudinal flight stability passively. During gliding, the center of mass should be slightly forward from the wing’s center of pressure.
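To make those three conditions concrete, here’s a minimal sketch that just checks the sign of each margin. The positions are entirely hypothetical and are not taken from the paper.

```python
# A minimal sketch of the three balance conditions quoted above.
# Positions are hypothetical (meters from the nose along the body axis);
# smaller numbers are closer to the nose, i.e., farther forward.
def check(name, com, ref, com_should_be_forward):
    forward = com < ref
    ok = forward == com_should_be_forward
    print(f"{name}: CoM {'forward of' if forward else 'behind'} reference "
          f"-> {'OK' if ok else 'violates condition'}")

com = 0.10           # hypothetical center of mass
cop_jet = 0.16       # hypothetical center of pressure during jetting
cob_float = 0.08     # hypothetical center of buoyancy while floating
cop_glide = 0.11     # hypothetical wing center of pressure while gliding

check("jetting", com, cop_jet, com_should_be_forward=True)     # CoM well forward of CoP
check("floating", com, cob_float, com_should_be_forward=False) # CoM behind CoB
check("gliding", com, cop_glide, com_should_be_forward=True)   # CoM slightly forward of CoP
```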

The current version is mostly optimized for the jetting phase of flight, and doesn’t have any active flight control surfaces yet, but the researchers are optimistic that if they added some they’d have no problem getting the robot to fly autonomously. It’s just a glider at the moment, but a low-power propeller is the obvious step after that, and to get really fancy, a switchable gearbox could enable efficient movement on water as well as in the air. Long-term, the idea is that robots like these would be useful for tasks like autonomous water sampling over large areas, but I’d personally be satisfied with a remote controlled version that I could take to the beach.

“Consecutive aquatic jump-gliding with water-reactive fuel,” by R. Zufferey, A. Ortega Ancel, A. Farinha, R. Siddall, S. F. Armanini, M. Nasr, R. V. Brahmal, G. Kennedy, and M. Kovac from Imperial College London, is published in the current issue of Science Robotics.
