
#435648 Surprisingly Speedy Soft Robot Survives ...

Soft robots are getting more and more popular for some very good reasons. Their relative simplicity is one. Their relative low cost is another. And for their simplicity and low cost, they’re generally able to perform very impressively, leveraging the unique features inherent to their design and construction to move themselves and interact with their environment. The other significant reason why soft robots are so appealing is that they’re durable. Without the constraints of rigid parts, they can withstand the sort of abuse that would make any roboticist cringe.

In the current issue of Science Robotics, a group of researchers from Tsinghua University in China and University of California, Berkeley, present a new kind of soft robot that’s both higher performance and much more robust than just about anything we’ve seen before. The deceptively simple robot looks like a bent strip of paper, but it’s able to move at 20 body lengths per second and survive being stomped on by a human wearing tennis shoes. Take that, cockroaches.

This prototype robot measures just 3 centimeters by 1.5 cm. It takes a scanning electron microscope to actually see what the robot is made of—a thermoplastic layer is sandwiched between palladium-gold electrodes, bonded with adhesive silicone to a structural plastic at the bottom. When an AC voltage (as low as 8 volts but typically about 60 volts) is run through the electrodes, the thermoplastic extends and contracts, causing the robot’s back to flex and the little “foot” to shuffle. A complete step cycle takes just 50 milliseconds, which works out to a 20-hertz gait. And technically, the robot “runs,” since its gait does include a brief aerial phase.
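For a quick sense of scale, here is a back-of-the-envelope Python sketch that turns the figures above (3 cm body, 50 ms step cycle, 20 body lengths per second) into a step rate and an absolute top speed. The constants come straight from the article; nothing else is assumed.

```python
# Quick consistency check on the numbers reported for the strip robot.

BODY_LENGTH_CM = 3.0        # prototype length: 3 cm
STEP_CYCLE_S = 0.050        # one complete step cycle: 50 ms
TOP_RELATIVE_SPEED = 20     # reported top speed, body lengths per second

step_rate_hz = 1.0 / STEP_CYCLE_S                        # 20 steps per second
top_speed_cm_s = TOP_RELATIVE_SPEED * BODY_LENGTH_CM     # 60 cm/s

print(f"step rate: {step_rate_hz:.0f} Hz")
print(f"top speed: {top_speed_cm_s:.0f} cm/s ({TOP_RELATIVE_SPEED} BL/s)")
```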

Image: Science Robotics

Photos from a high-speed camera show the robot’s gait (A to D) as it contracts and expands its body.

To put the robot’s top speed of 20 body lengths per second in perspective, have a look at this nifty chart, which shows the relative running speeds of various animals and robots versus body mass:

Image: Science Robotics

This chart shows the relative running speeds of some mammals (purple area), arthropods (orange area), and soft robots (blue area) versus body mass. For both mammals and arthropods, relative speed follows a strong negative scaling law with respect to body mass: speeds increase as body masses decrease. For soft robots, however, the relationship has generally been the opposite: speeds decrease as body mass decreases. For the little soft robots created by the researchers from Tsinghua University and UC Berkeley (red stars), the scaling law is similar to that of living animals: higher speeds are attained as body mass decreases.

If you were wondering, like we were, just what that number 39 is on that chart (top left corner), it’s a species of tiny mite that was discovered underneath a rock in California in 1916. The mite is just under 1 mm in size, but it can run at 0.8 kilometer per hour, which is 322 body lengths per second, making it by far (like, by a factor of two at least) the fastest land animal on Earth relative to size. If a human were to run that fast relative to our size, we’d be traveling at a little bit over 2,000 kilometers per hour. It’s not a coincidence that pretty much everything in the upper left of the chart is an insect—speed scales favorably with decreasing mass, since actuators have a proportionally larger effect.
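If you want to reproduce the relative-speed math yourself, here is a small Python sketch for the mite and human figures above. The body lengths (roughly 0.7 mm for the mite, 1.8 m for a human) are our own ballpark assumptions, so treat the outputs as approximate.

```python
# Rough arithmetic behind the relative-speed comparisons above.

def relative_speed(speed_m_per_s: float, body_length_m: float) -> float:
    """Speed expressed in body lengths per second."""
    return speed_m_per_s / body_length_m

mite_speed_m_s = 0.8 * 1000 / 3600             # 0.8 km/h in m/s
print(relative_speed(mite_speed_m_s, 0.0007))  # roughly 320 BL/s for a ~0.7 mm mite

# A ~1.8 m human moving at 322 body lengths per second:
human_speed_m_s = 322 * 1.8
print(human_speed_m_s * 3.6)                   # a little over 2,000 km/h
```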

Other notable robots on the chart with impressive speed to mass ratios are number 27, which is this magnetically driven quadruped robot from UMD, and number 86, UC Berkeley’s X2-VelociRoACH.

Anyway, back to this robot. Some other cool things about it:

You can step on it, squishing it flat with a load about 1 million times its own body weight, and it’ll keep on crawling, albeit only half as fast.
Even climbing a slope of 15 degrees, it can still manage to move at 1 body length per second.
It carries peanuts! With a payload of six times its own weight, it moves a sixth as fast, but still, it’s not like you need your peanuts delivered all that quickly anyway, do you?

Image: Science Robotics

The researchers also put together a prototype with two legs instead of one, which was able to demonstrate a potentially faster galloping gait by spending more time in the air. They suggest that robots like these could be used for “environmental exploration, structural inspection, information reconnaissance, and disaster relief,” which are the sorts of things that you suggest that your robot could be used for when you really have no idea what it could be used for. But this work is certainly impressive, with speed and robustness that are largely unmatched by other soft robots. An untethered version seems possible due to the relatively low voltages required to drive the robot, and if they can put some peanut-sized sensors on there as well, practical applications might actually be forthcoming sometime soon.

“Insect-scale Fast Moving and Ultrarobust Soft Robot,” by Yichuan Wu, Justin K. Yim, Jiaming Liang, Zhichun Shao, Mingjing Qi, Junwen Zhong, Zihao Luo, Xiaojun Yan, Min Zhang, Xiaohao Wang, Ronald S. Fearing, Robert J. Full, and Liwei Lin from Tsinghua University and UC Berkeley, is published in Science Robotics.


#435646 Video Friday: Kiki Is a New Social Robot ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 2, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

The DARPA Subterranean Challenge tunnel circuit takes place in just a few weeks, and we’ll be there!

[ DARPA SubT ]

This time-lapse video shows the robotic arm on NASA’s Mars 2020 rover handily maneuvering 88 pounds (40 kilograms) worth of sensor-laden turret as it moves from a deployed to a stowed configuration.

If you haven’t read our interview with Matt Robinson, now would be a great time, since he’s one of the folks at JPL who designed this arm.

[ Mars 2020 ]

Kiki is a small, white, stationary social robot with an evolving personality that promises to be your friend. It costs $800 and is currently on Kickstarter.

The Kickstarter page is filled with the same type of overpromising that we’ve seen with other (now very dead) social robots: Kiki is “conscious,” “understands your feelings,” and “loves you back.” Oof. That said, we’re happy to see more startups trying to succeed in this space, which is certainly one of the toughest in consumer electronics, and hopefully they’ve been learning from the recent string of failures. And we have to say Kiki is a cute robot. Its overall design, especially the body mechanics and expressive face, looks neat. And kudos to the team—the company was founded by two ex-Googlers, Mita Yun and Jitu Das—for including the “unedited prototype videos,” which help counterbalance the hype.

Another thing that Kiki has going for it is that everything runs on the robot itself. This simplifies privacy and means that the robot won’t partially die on you if the company behind it goes under, but it also limits how clever the robot will be able to be. The Kickstarter campaign is already over a third funded, so… we’ll see.

[ Kickstarter ]

When your UAV isn’t enough UAV, you put a UAV on your UAV.

[ CanberraUAV ]

ABB’s YuMi is testing ATMs because a human trying to do this task would go broke almost immediately.

[ ABB ]

DJI has a fancy new FPV system that features easy setup, digital HD streaming at up to 120 FPS, and <30ms latency.

If it looks expensive, that’s because it costs $930 with the remote included.

[ DJI ]

Honeybee Robotics has recently developed a regolith excavation and rock cleaning system for NASA JPL’s PUFFER rovers. This system, called POCCET (PUFFER-Oriented Compact Cleaning and Excavation Tool), uses compressed gas to perform all excavation and cleaning tasks. Weighing less than 300 grams with potential for further mass reduction, POCCET can be used not just on the Moon, but on other Solar System bodies such as asteroids, comets, and even Mars.

[ Honeybee Robotics ]

DJI’s 2019 RoboMaster tournament, which takes place this month in Shenzhen, looks like it’ll be fun to watch, with plenty of action and rules that are easy to understand.

[ RoboMaster ]

Robots and baked goods are an automatic Video Friday inclusion.

Wow I want a cupcake right now.

[ Soft Robotics ]

The ICRA 2019 Best Paper Award went to Michelle A. Lee at Stanford, for “Making Sense of Vision and Touch: Self-Supervised Learning of Multimodal Representations for Contact-Rich Tasks.”

The ICRA video is here, and you can find the paper at the link below.

[ Paper ] via [ RoboHub ]

Cobalt Robotics put out a bunch of marketing-y videos this week, but this one is reasonably interesting, even if you’re familiar with what they’re doing over there.

[ Cobalt Robotics ]

RightHand Robotics launched RightPick2 with a gala event, which looked like fun as long as you were really, really into robots.

[ RightHand Robotics ]

Thanks Jeff!

This video presents a framework for whole-body control applied to the assistive robotic system EDAN. We show how the proposed method can be used for a task like opening, passing through, and closing a door. We also show the efficiency of whole-body coordination when controlling the end-effector with respect to a fixed reference, and how easily the system can be manually maneuvered by direct interaction with the end-effector, without the need for an extra input device.
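For readers curious what holding an end-effector to a fixed reference while coordinating the whole body can look like in code, here is a minimal resolved-rate sketch with a nullspace posture task. To be clear, this is a generic textbook formulation, not DLR’s actual EDAN controller, and the jacobian, forward_kinematics, and q_rest inputs are hypothetical stand-ins for a real robot model.

```python
# Minimal velocity-level whole-body control sketch (not the EDAN controller).
import numpy as np

def whole_body_step(q, x_ref, jacobian, forward_kinematics,
                    q_rest, kp=1.0, kp_posture=0.1):
    """One control step: primary end-effector task toward a fixed reference,
    plus a secondary posture task projected into the task nullspace."""
    J = jacobian(q)                           # task Jacobian, shape (m, n)
    x = forward_kinematics(q)                 # current end-effector task value
    J_pinv = np.linalg.pinv(J)

    dq_task = J_pinv @ (kp * (x_ref - x))     # track the fixed reference
    N = np.eye(len(q)) - J_pinv @ J           # nullspace projector
    dq_posture = N @ (kp_posture * (q_rest - q))  # gently pull toward a rest pose
    return dq_task + dq_posture               # joint velocity command
```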

[ DLR ]

You’ll probably need to turn on auto-translated subtitles for most of this, but it’s worth it for the adorable little single-seat robotic car designed to help people get around airports.

[ ZMP ]

In this week’s episode of Robots in Depth, Per speaks with Gonzalo Rey from Moog about their fancy 3D printed integrated hydraulic actuators.

Gonzalo talks about how Moog got started with hydraulic control, taking part in the space program and early robotics development. He shares how Moog’s technology is used in fly-by-wire systems in aircraft and in flow control in deep space probes. They have even reached Mars.

[ Robots in Depth ]


#435642 Drone X Challenge 2020

Krypto Labs opens applications for Drone X Challenge 2020 Phase II, a US$1.5+ Million Global Challenge (US$1 Million Final Prize and US$500,000+ in R&D Grants)

In its most rewarding initiative to date, Krypto Labs, the global innovation hub with a unique ecosystem for funding ground-breaking startups, has announced the opening of Phase II of Drone X Challenge (DXC) 2020, the global multimillion-dollar challenge that is pushing the frontiers of innovation in drone technologies focusing on high payload capacity and high flight endurance.

Drone X Challenge 2020 is open to entrepreneurs, start-ups, researchers, university students and established companies. Teams that want to apply for Drone X Challenge 2020 Phase II will have to develop a drone system capable of achieving the minimum endurance and payload as per the category they are applying to.

Categories:

Fixed-wing drones, battery powered
Fixed-wing drones, hybrid/hydrocarbon powered
Multi-rotor drones, battery powered
Multi-rotor drones, hybrid/hydrocarbon powered

Drone X Challenge 2020 is divided into three phases and a final event, with a US$1 million final prize. The outstanding applications that meet the requirements of Phase II will collectively receive US$300,000 in R&D grants.

The shortlisted teams of Phase I received US$320,000 in R&D grants; that phase required applicants to provide a technical proposal detailing the design of a drone capable of meeting the minimum payload and endurance requirements.

The shortlisted teams of Drone X Challenge 2020 Phase I are:

RigiTech from Switzerland
Forward Robotics from Canada
Industrial Technology Research Institute (ITRI) from Taiwan
KopterKraft from Germany
DV8 Tech from USA
Richen Power from China
Vulcan UAV Ltd from UK

Dr. Saleh Al Hashemi, Managing Director of Krypto Labs said: “This competition aligns with our efforts in contributing to the development of drone technology globally. We aim to redefine the way drone technologies are impacting our lives, and Krypto Labs is proud to be leading the way in the region by supporting startups, established companies, and industries involved in the field of drone development. By catalyzing and supporting these cutting-edge solutions, we aim to continue leveraging disruptive technologies that can create value and make an impact.”

For more information about Drone X Challenge 2020, please visit https://dronexchallenge2020.com.


#435640 Video Friday: This Wearable Robotic Tail ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
CLAWAR 2019 – August 26-28, 2019 – Kuala Lumpur, Malaysia
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 2, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
IROS 2019 – November 4-8, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

Lakshmi Nair from Georgia Tech describes some fascinating research towards robots that can create their own tools, as presented at ICRA this year:

Using a novel capability to reason about shape, function, and attachment of unrelated parts, researchers have for the first time successfully trained an intelligent agent to create basic tools by combining objects.

The breakthrough comes from Georgia Tech’s Robot Autonomy and Interactive Learning (RAIL) research lab and is a significant step toward enabling intelligent agents to devise more advanced tools that could prove useful in hazardous – and potentially life-threatening – environments.

[ Lakshmi Nair ]

Victor Barasuol, from the Dynamic Legged Systems Lab at IIT, wrote in to share some new research on their HyQ quadruped that enables sensorless shin collision detection. This helps the robot navigate unstructured environments, and also mitigates all those painful shin strikes, because ouch.

This will be presented later this month at the International Conference on Climbing and Walking Robots (CLAWAR) in Kuala Lumpur, Malaysia.

[ IIT ]

Thanks Victor!

You used to have a tail, you know—as an embryo, about a month into your development. All mammals used to have tails, and now we just have useless tailbones, which don’t help us with balancing even a little bit. BRING BACK THE TAIL!

The tail, created by Junichi Nabeshima, Kouta Minamizawa, and MHD Yamen Saraiji from Keio University’s Graduate School of Media Design, was presented at SIGGRAPH 2019 Emerging Technologies.

[ Paper ] via [ Gizmodo ]

The noises in this video are fantastic.

[ ESA ]

Apparently the industrial revolution wasn’t a thorough enough beatdown of human knitting, because the robots are at it again.

[ MIT CSAIL ]

Skydio’s drones just keep getting more and more impressive. Now if only they’d make one that I can afford…

[ Skydio ]

The only thing more fun than watching robots is watching people react to robots.

[ SEER ]

There aren’t any robots in this video, but it’s robotics-related research, and very soothing to watch.

[ Stanford ]

#autonomousicecreamtricycle

In case it wasn’t clear, which it wasn’t, this is a Roboy project. And if you didn’t understand that first video, you definitely won’t understand this second one:

Whatever that t-shirt is at the end (Roboy in sunglasses puking rainbows…?), I need one.

[ Roboy ]

By adding electronics and computation technology to a simple cane that has been around since ancient times, a team of researchers at Columbia Engineering has transformed it into a 21st century robotic device that can provide light-touch assistance in walking to the aged and others with impaired mobility.

The light-touch robotic cane, called CANINE, acts as a cane-like mobile assistant. The device improves the individual’s proprioception, or self-awareness in space, during walking, which in turn improves stability and balance.

[ ROAR Lab ]

During the second field experiment for DARPA’s OFFensive Swarm-Enabled Tactics (OFFSET) program, which took place at Fort Benning, Georgia, teams of autonomous air and ground robots tested tactics on a mission to isolate an urban objective. Similar to the way a firefighting crew establishes a boundary around a burning building, they first identified locations of interest and then created a perimeter around the focal point.

[ DARPA ]

I think there’s a bit of new footage here of Ghost Robotics’ Vision 60 quadruped walking around without sensors on unstructured terrain.

[ Ghost Robotics ]

If you’re as tired of passenger drone hype as I am, there’s absolutely no need to watch this video of NEC’s latest hover test.

[ AP ]

As researchers teach robots to perform more and more complex tasks, the need for realistic simulation environments is growing. Existing techniques for closing the reality gap by approximating real-world physics often require extensive real world data and/or thousands of simulation samples. This paper presents TuneNet, a new machine learning-based method to directly tune the parameters of one model to match another using an iterative residual tuning technique. TuneNet estimates the parameter difference between two models using a single observation from the target and minimal simulation, allowing rapid, accurate and sample-efficient parameter estimation.

The system can be trained via supervised learning over an auto-generated simulated dataset. We show that TuneNet can perform system identification, even when the true parameter values lie well outside the distribution seen during training, and demonstrate that simulators tuned with TuneNet outperform existing techniques for predicting rigid body motion. Finally, we show that our method can estimate real-world parameter values, allowing a robot to perform sim-to-real task transfer on a dynamic manipulation task unseen during training. We are also making a baseline implementation of our code available online.
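As a rough illustration of the iterative residual-tuning idea described above, here is a schematic Python loop. It is a paraphrase of the concept, not the authors’ code: tune_net and simulate are hypothetical placeholders for the learned correction model and the simulator being tuned.

```python
# Schematic iterative residual tuning: nudge simulator parameters toward
# the real system using a learned parameter-difference estimator.

def tune_parameters(theta, target_observation, tune_net, simulate, n_iters=5):
    """Iteratively refine simulator parameters theta to match a target system."""
    for _ in range(n_iters):
        sim_observation = simulate(theta)                       # one cheap sim rollout
        delta = tune_net(sim_observation, target_observation)   # predicted residual
        theta = theta + delta                                   # apply the correction
    return theta
```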

[ Paper ]

Here’s an update on what GITAI has been up to with their telepresence astronaut-replacement robot.

[ GITAI ]

Curiosity captured this 360-degree panorama of a location on Mars called “Teal Ridge” on June 18, 2019. This location is part of a larger region the rover has been exploring called the “clay-bearing unit” on the side of Mount Sharp, which is inside Gale Crater. The scene is presented with a color adjustment that approximates white balancing to resemble how the rocks and sand would appear under daytime lighting conditions on Earth.

[ MSL ]

Some updates (in English) on ROS from ROSCon France. The first is a keynote from Brian Gerkey:

And this second video is from Omri Ben-Bassat, about how to keep your Anki Vector alive using ROS:

All of the ROSCon FR talks are available on Vimeo.

[ ROSCon FR ]


#435634 Robot Made of Clay Can Sculpt Its Own ...

We’re very familiar with a wide variety of transforming robots—whether for submarines or drones, transformation is a way of making a single robot adaptable to different environments or tasks. Usually, these robots are restricted to a discrete number of configurations—perhaps two or three different forms—because of the constraints imposed by the rigid structures that robots are typically made of.

Soft robotics has the potential to change all this, with robots that don’t have fixed forms but instead can transform themselves into whatever shape will enable them to do what they need to do. At ICRA in Montreal earlier this year, researchers from Yale University demonstrated a creative approach toward a transforming robot powered by string and air, with a body made primarily out of clay.

Photo: Evan Ackerman

The robot is actuated by two different kinds of “skin,” one layered on top of another. There’s a locomotion skin, made of a pattern of pneumatic bladders that can roll the robot forward or backward when the bladders are inflated sequentially. On top of that is the morphing skin, which is cable-driven, and can sculpt the underlying material into a variety of shapes, including spheres, cylinders, and dumbbells. The robot itself consists of both of those skins wrapped around a chunk of clay, with the actuators driven by offboard power and control. Here it is in action:

The Yale researchers have been experimenting with morphing robots that use foams and tensegrity structures for their bodies, but that stuff provides a “restoring force,” springing back into its original shape once the actuation stops. Clay is different because it holds whatever shape it’s formed into, making the robot more energy efficient. And if the dumbbell shape stops being useful, the morphing layer can just squeeze it back into a cylinder or a sphere.
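To make the locomotion skin’s “inflate the bladders one after another” idea concrete, here is a toy Python sketch of a sequential inflation controller. The valve interface, timing, and cycle count are all illustrative assumptions, not details from the Yale paper.

```python
# Toy sequential-inflation gait: fire pneumatic bladders one at a time to
# roll the clay body. `valves` is a hypothetical list of objects exposing
# open() and close() methods for each bladder's supply line.
import time

def roll(valves, dwell_s=0.5, cycles=3, reverse=False):
    """Roll the body forward (or backward if reverse=True) for a few cycles."""
    order = list(reversed(valves)) if reverse else list(valves)
    for _ in range(cycles):
        for valve in order:
            valve.open()         # inflate this bladder, tipping the body over
            time.sleep(dwell_s)  # hold long enough for the body to roll
            valve.close()        # deflate before the next bladder fires
```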

While this robot, and the sample transformation shown in the video, are relatively simplistic, the researchers suggest some ways in which a more complex version could be used in the future:

Photo: IEEE Xplore

This robot’s morphing skin sculpts its clay body into different shapes.

Applications where morphing and locomotion might serve as complementary functions are abundant. For the example skins presented in this work, a search-and-rescue operation could use the clay as a medium to hold a payload such as sensors or transmitters. More broadly, applications include resource-limited conditions where supply chains for materiel are sparse. For example, the morphing sequence shown in Fig. 4 [above] could be used to transform from a rolling sphere to a pseudo-jointed robotic arm. With such a morphing system, it would be possible to robotically morph matter into different forms to perform different functions.

Read this article for free on IEEE Xplore until 5 September 2019

Morphing Robots Using Robotic Skins That Sculpt Clay, by Dylan S. Shah, Michelle C. Yuen, Liana G. Tilton, Ellen J. Yang, and Rebecca Kramer-Bottiglio from Yale University, was presented at ICRA 2019 in Montreal.

[ Yale Faboratory ]

