Tag Archives: robotics

#437789 Video Friday: Robotic Glove Features ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Evidently, the folks at Unitree were paying attention to last week’s Video Friday.

[ Unitree ]

RoboSoft 2020 was a virtual conference this year (along with everything else), but they still held a soft robots contest, and here are four short vids—you can watch the rest of them here.

[ RoboSoft 2020 ]

If you were wondering why SoftBank bought Aldebaran Robotics and Boston Dynamics, here’s the answer.

I am now a Hawks fan. GO HAWKS!

[ Softbank Hawks ] via [ RobotStart ]

Scientists at the University of Liverpool have developed a fully autonomous mobile robot to assist them in their research. Using a type of AI, the robot has been designed to work uninterrupted for weeks at a time, allowing it to analyse data and make decisions on what to do next. Using a flexible arm with a customised gripper, it can be calibrated to interact with most standard lab equipment and machinery, as well as navigate safely around human co-workers and obstacles.

[ Nature ]

Oregon State’s Cassie has been on break for a couple of months, but it’s back in the lab and moving alarmingly quickly.

[ DRL ]

The current situation linked to COVID-19 sadly led to the postponement of this year’s RoboCup 2020 at Bordeaux. As an official sponsor of The RoboCup, SoftBank Robotics wanted to take this opportunity to thank all RoboCupers and The RoboCup Federation for their support these past 13 years. We invite you to take a look at NAO’s adventure at The RoboCup as the official robot of the Standard Platform League. See you in Bordeaux in 2021!

[ RoboCup 2021 ]

Miniature SAW robot crawling inside the intestines of a pig. You’re welcome.

[ Zarrouk Lab ]

The video demonstrates fast autonomous flight experiments in cluttered unknown environments, with the support of a robust and perception-aware replanning framework called RAPTOR. The associated paper is submitted to TRO.

[ HKUST ]

Since we haven’t gotten autonomy quite right yet, there’s a lot of telepresence going on for robots that operate in public spaces. Usually, you’ve got one remote human managing multiple robots, so it would be nice to make that interface a little more friendly, right?

[ HCI Lab ]

Arguable whether or not this is a robot, but it’s cool enough to spend a minute watching.

[ Ishikawa Lab ]

Communication is critical to collaboration; however, too much of it can degrade performance. Motivated by the need for effective use of a robot’s communication modalities, in this work, we present a computational framework that decides if, when, and what to communicate during human-robot collaboration.

[ Interactive Robotics ]

Robotiq has released the next generation of its grippers for collaborative robots: the 2F-85 and 2F-140. Both models gain greater robustness, safety, and customizability while retaining the same key benefits that have inspired thousands of manufacturers to choose them since their launch six years ago.

[ Robotiq ]

ANYmal C, the autonomous legged robot designed for challenging industrial environments, provides the mobility, autonomy and inspection intelligence to enable safe and efficient inspection operations. In this virtual showcase, discover how ANYmal climbs stairs, recovers from a fall, performs an autonomous mission, avoids obstacles, docks to charge by itself, digitizes analogue sensors and monitors the environment.

[ ANYbotics ]

At Waymo, we are committed to addressing inequality, and we believe listening is a critical first step toward driving positive change. Earlier this year, five Waymonauts sat down to share their thoughts on equity at work, challenging the status quo, and more. This is what they had to say.

[ Waymo ]

Nice of ABB to take in old robots and upgrade them into new robots again. Robots forever!

[ ABB ]

It’s nice seeing the progress being made by GITAI, one of the teams competing in the ANA Avatar XPRIZE Challenge, and also meeting the humans behind the robots.

[ GITAI ] via [ XPRIZE ]

One more talk from the ICRA Legged Robotics Workshop: Jingyu Liu from DeepRobotics and Qiuguo Zhu from Zhejiang University.

[ Deep Robotics ]

Posted in Human Robots

#437783 Ex-Googler’s Startup Comes Out of ...

Over the last 10 years, the PR2 has helped roboticists make an enormous amount of progress in mobile manipulation over a relatively short time. I mean, it’s been a decade already, but still—robots are hard, and giving a bunch of smart people access to a capable platform where they didn’t have to worry about hardware and could instead focus on doing interesting and useful things helped to establish a precedent for robotics research going forward.

Unfortunately, not everyone can afford an enormous US $400,000 robot, and even if they could, PR2s are getting very close to the end of their lives. There are other mobile manipulators out there taking the place of the PR2, but so far, size and cost have largely restricted them to research labs. Lots of good research is being done, but it’s getting to the point where folks want to take the next step: making mobile manipulators real-world useful.

Today, a company called Hello Robot is announcing a new mobile manipulator called the Stretch RE1. With offices in the San Francisco Bay Area and in Atlanta, Ga., Hello Robot is led by Aaron Edsinger and Charlie Kemp, and by combining decades of experience in industry and academia they’ve managed to come up with a robot that’s small, lightweight, capable, and affordable, all at the same time. For now, it’s a research platform, but eventually, its creators hope that it will be able to come into our homes and take care of us when we need it to.

A fresh look at mobile manipulators
To understand the concept behind Stretch, it’s worth taking a brief look back at what Edsinger and Kemp have been up to for the past 10 years. Edsinger co-founded Meka Robotics in 2007, which built expensive, high performance humanoid arms, torsos, and heads for the research market. Meka was notable for being the first robotics company (as far as we know) to sell robot arms that used series elastic actuators, and the company worked extensively with Georgia Tech researchers. In 2011, Edsinger was one of the co-founders of Redwood Robotics (along with folks from SRI and Willow Garage), which was going to develop some kind of secret and amazing new robot arm before Google swallowed it in late 2013. At the same time, Google also acquired Meka and a bunch of other robotics companies, and Edsinger ended up at Google as one of the directors of its robotics program, until he left to co-found Hello Robot in 2017.

Meanwhile, since 2007 Kemp has been a robotics professor at Georgia Tech, where he runs the Healthcare Robotics Lab. Kemp’s lab was one of the 11 PR2 beta sites, giving him early experience with a ginormous mobile manipulator. Much of the research that Kemp has spent the last decade on involves robots providing assistance to untrained users, often through direct physical contact, and frequently either in their own homes or in a home environment. We should mention that the Georgia Tech PR2 is still going, most recently doing some clever material classification work in a paper for IROS later this year.

Photo: Hello Robot

Hello Robot co-founder and CEO Aaron Edsinger says that, although Stretch is currently a research platform, he hopes to see the robot deployed in home environments, adding that the “impact we want to have is through robots that are helpful to people in society.”

So with all that in mind, where’d Hello Robot come from? As it turns out, both Edsinger and Kemp were in Rodney Brooks’ group at MIT, so it’s perhaps not surprising that they share some of the same philosophies about what robots should be and what they should be used for. After collaborating on a variety of projects over the years, in 2017 Edsinger was thinking about his next step after Google when Kemp stopped by to show off some video of a new robot prototype that he’d been working on—the prototype for Stretch. “As soon as I saw it, I knew that was exactly the kind of thing I wanted to be working on,” Edsinger told us. “I’d become frustrated with the complexity of the robots being built to do manipulation in home environments and around people, and it solved a lot of problems in an elegant way.”

For Kemp, Stretch is an attempt to get everything he’s been teaching his robots out of his lab at Georgia Tech and into the world where it can actually be helpful to people. “Right from the beginning, we were trying to take our robots out to real homes and interact with real people,” says Kemp. Georgia Tech’s PR2, for example, worked extensively with Henry and Jane Evans, helping Henry (a quadriplegic) regain some of the bodily autonomy he had lost. With the assistance of the PR2, Henry was able to keep himself comfortable for hours without needing a human caregiver to be constantly with him. “I felt like I was making a commitment in some ways to some of the people I was working with,” Kemp told us. “But 10 years later, I was like, where are these things? I found that incredibly frustrating. Stretch is an effort to try to push things forward.”

A robot you can put in the backseat of a car
One way to put Stretch in context is to think of it almost as a reaction to the kitchen sink philosophy of the PR2. Where the PR2 was designed to be all the robot anyone could ever need (plus plenty of robot that nobody really needed) embodied in a piece of hardware that weighed 225 kilograms and cost nearly half a million dollars, Stretch is completely focused on being just the robot that is actually necessary in a form factor that’s both much smaller and more affordable. The entire robot weighs a mere 23 kg in a footprint that’s just a 34 cm square. As you can see from the video, it’s small enough (and safe enough) that it can be moved by a child. The cost? At $17,950 apiece—or a bit less if you buy a bunch at once—Stretch costs a fraction of what other mobile manipulators sell for.

It might not seem like size or weight should be that big of an issue, but it very much is, explains Maya Cakmak, a robotics professor at the University of Washington, in Seattle. Cakmak worked with PR2 and Henry Evans when she was at Willow Garage, and currently has access to both a PR2 and a Fetch research robot. “When I think about my long term research vision, I want to deploy service robots in real homes,” Cakmak told us. Unfortunately, it’s the robots themselves that have been preventing her from doing this—both the Fetch and the PR2 are large enough that moving them anywhere requires a truck and a lift, which also limits the homes they can be used in. “For me, I felt immediately that Stretch is very different, and it makes a lot of sense,” she says. “It’s safe and lightweight, you can probably put it in the backseat of a car.” For Cakmak, Stretch’s size is the difference between being able to easily take a robot to the places she wants to do research in, and not. And cost is a factor as well, since a cheaper robot means more access for her students. “I got my refurbished PR2 for $180,000,” Cakmak says. “For that, with Stretch I could have 10!”

“I felt immediately that Stretch is very different. It’s safe and lightweight, you can probably put it in the backseat of a car. I got my refurbished PR2 for $180,000. For that, with Stretch I could have 10!”
—Maya Cakmak, University of Washington

Of course, a portable robot doesn’t do you any good if the robot itself isn’t sophisticated enough to do what you need it to do. Stretch is certainly a compromise in functionality in the interest of small size and low cost, but it’s a compromise that’s been carefully thought out, based on the experience that Edsinger has building robots and the experience that Kemp has operating robots in homes. For example, most mobile manipulators are essentially multi-degree-of-freedom arms on mobile bases. Stretch instead leverages its wheeled base to move its arm in the horizontal plane, which (most of the time) works just as well as an extra DoF or two on the arm while saving substantially on weight and cost. Similarly, Stretch relies almost entirely on one sensor, an Intel RealSense D435i on a pan-tilt head that gives it a huge range of motion. The RealSense serves as a navigation camera, a manipulation camera, a 3D mapping system, and more. It’s not going to be quite as good for a task that might involve fine manipulation, but most of the time it’s totally workable and you’re saving on cost and complexity.
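To see why the wheeled base can stand in for an arm joint, here’s a minimal kinematics sketch. The frame conventions (arm telescoping at a right angle to the base heading) are our assumption for illustration, not Hello Robot’s published kinematics:

```python
import math

def stretch_like_fk(base_x, base_y, base_theta, lift_z, arm_ext):
    """Hypothetical forward kinematics for a Stretch-style layout.

    The arm telescopes sideways, perpendicular to the base heading, so
    driving and turning the wheeled base supplies the other horizontal
    degree of freedom; the vertical pole supplies lift_z directly.
    """
    # Arm extends 90 degrees clockwise from the heading (an assumption).
    ex = base_x + arm_ext * math.cos(base_theta - math.pi / 2.0)
    ey = base_y + arm_ext * math.sin(base_theta - math.pi / 2.0)
    return ex, ey, lift_z
```

With the base at the origin facing along +x and the arm extended 0.5 m, the gripper sits at (0, -0.5) in the plane; turning or driving the base sweeps that point anywhere in the workspace, which is the trade the article describes.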

Stretch has been relentlessly optimized to be the absolutely minimum robot to do mobile manipulation in a home or workplace environment. In practice, this meant figuring out exactly what it was absolutely necessary for Stretch to be able to do. With an emphasis on manipulation, that meant defining the workspace of the robot, or what areas it’s able to usefully reach. “That was one thing we really had to push hard on,” says Edsinger. “Reachability.” He explains that reachability and a small mobile base tend not to go together, because robot arms (which tend to weigh a lot) can cause a small base to tip, especially if they’re moving while holding a payload. At the same time, Stretch needed to be able to access both countertops and the floor, while being able to reach out far enough to hand people things without having to be right next to them. To come up with something that could meet all those requirements, Edsinger and Kemp set out to reinvent the robot arm.

Stretch’s key innovation: a stretchable arm
The design they came up with is rather ingenious in its simplicity and how well it works. Edsinger explains that the arm consists of five telescoping links: one fixed and four moving. They are constructed of custom carbon fiber, and are driven by a single motor, which is attached to the robot’s vertical pole. The strong, lightweight structure allows the arm to extend over half a meter and hold up to 1.5 kg. Although the company has a patent pending for the design, Edsinger declined to say whether the links are driven by a belt, cables, or gears. “We don’t want to disclose too much of the secret sauce [with regard to] the drive mechanism.” He added that the arm was “one of the most significant engineering challenges on the robot in terms of getting the desired reach, compactness, precision, smoothness, force sensitivity, and low cost to all happily coexist.”

Photo: Hello Robot

Stretch’s arm consists of five telescoping links constructed of custom carbon fiber and driven by a single motor attached to the robot’s vertical pole, minimizing weight and inertia. The arm has a reach of over half a meter and can hold up to 1.5 kg.

Another interesting feature of Stretch is its interface with the world—its gripper. There are countless different gripper designs out there, each and every one of which is the best at gripping some particular subset of things. But making a generalized gripper for all of the stuff that you’d find in a home is exceptionally difficult. Ideally, you’d want some sort of massive experimental test program where thousands and thousands of people test out different gripper designs in their homes for long periods of time and then tell you which ones work best. Obviously, that’s impractical for a robotics startup, but Kemp realized that someone else was already running the study for him: Amazon.

“I had this idea that there are these assistive grabbers that people with disabilities use to grasp objects in the real world,” he told us. Kemp went on Amazon’s website and looked at the top 10 grabbers and the reviews from thousands of users. He then bought a bunch of different ones and started testing them. “This one [Stretch’s gripper], I almost didn’t order it, it was such a weird looking thing,” he says. “But it had great reviews on Amazon, and oh my gosh, it just blew away the other grabbers. And I was like, that’s it. It just works.”

Stretch’s teleoperated and autonomous capabilities
As with any robot intended to be useful outside of a structured environment, hardware is only part of the story, and arguably not even the most important part. In order for Stretch to be able to operate out from under the supervision of a skilled roboticist, it has to be either easy to control, or autonomous. Ideally, it’s both, and that’s what Hello Robot is working towards, although things didn’t start out that way, Kemp explains. “From a minimalist standpoint, we began with the notion that this would be a teleoperated robot. But in the end, you just don’t get the real power of the robot that way, because you’re tied to a person doing stuff. As much as we fought it, autonomy really is a big part of the future for this kind of system.”

Here’s a look at some of Stretch’s teleoperated capabilities. We’re told that Stretch is very easy to get going right out of the box, although this teleoperation video from Hello Robot looks like it’s got a skilled and experienced user in the loop:

For such a low-cost platform, the autonomy (even at this early stage) is particularly impressive:

Since it’s not entirely clear from the video exactly what’s autonomous, here’s a brief summary of a couple of the more complex behaviors that Kemp sent us:

Object grasping: Stretch uses its 3D camera to find the nearest flat surface using a virtual overhead view. It then segments significant blobs on top of the surface. It selects the largest blob in this virtual overhead view and fits an ellipse to it. It then generates a grasp plan that makes use of the center of the ellipse and the major and minor axes. Once it has a plan, Stretch orients its gripper, moves to the pre-grasp pose, moves to the grasp pose, closes its gripper based on the estimated object width, lifts up, and retracts.
Mapping, navigating, and reaching to a 3D point: These demonstrations all use FUNMAP (Fast Unified Navigation, Manipulation and Planning). It’s all novel custom Python code. Even a single head scan performed by panning the 3D camera around can result in a very nice 3D representation of Stretch’s surroundings that includes the nearby floor. This is surprisingly unusual for robots, which often have their cameras too low to see many interesting things in a human environment. While mapping, Stretch selects where to scan next in a non-trivial way that considers factors such as the quality of previous observations, expected new observations, and navigation distance. The plan that Stretch uses to reach the target 3D point has been optimized for navigation and manipulation. For example, it finds a final robot pose that provides a large manipulation workspace for Stretch, which must consider nearby obstacles, including obstacles on the ground.
Object handover: This is a simple demonstration of object handovers. Stretch performs Cartesian motions to move its gripper to a body-relative position using a good motion heuristic, which is to extend the arm as the last step. These simple motions work well due to the design of Stretch. It still surprises me how well it moves the object to comfortable places near my body, and how unobtrusive it is. The goal point is specified relative to a 3D frame attached to the person’s mouth estimated using deep learning models (shown in the RViz visualization video). Specifically, Stretch targets handoff at a 3D point that is 20 cm below the estimated position of the mouth and 25 cm away along the direction of reaching.
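The overhead-view grasp pipeline Kemp describes (segment blobs on the surface, pick the largest, fit an ellipse, grasp along its axes) can be sketched in a few lines. This is our own illustrative reconstruction, not Stretch’s actual code; a principal-axis fit of the blob’s pixels stands in for the ellipse fit:

```python
import numpy as np

def plan_overhead_grasp(mask):
    """Fit centroid + principal axes to the largest blob in a virtual
    overhead occupancy mask. Returns (center, major_axis_unit, width)."""
    # Label connected blobs with a simple flood fill (4-connectivity).
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]
                    and mask[r, c] and not labels[r, c]):
                labels[r, c] = current
                stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    # Select the largest blob, as in the description.
    sizes = [(labels == i).sum() for i in range(1, current + 1)]
    pts = np.argwhere(labels == 1 + int(np.argmax(sizes))).astype(float)
    # Principal axes of the point cloud play the role of the fitted
    # ellipse's major/minor axes.
    center = pts.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov((pts - center).T))
    major = evecs[:, np.argmax(evals)]
    # Gripper opening estimate: ~2 sigma each side along the minor axis.
    width = 4.0 * np.sqrt(evals.min())
    return center, major, width
```

From here the real robot would orient its gripper along the minor axis, move through pre-grasp and grasp poses, close to the estimated width, lift, and retract.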

Many of these autonomous capabilities come directly from Kemp’s lab, and the demo code is available for anyone to use. (Hello Robot says all of Stretch’s software is open source.)

Photo: Hello Robot

Hello Robot co-founder and CEO Aaron Edsinger says Stretch is designed to work with people in homes and workplaces and can be teleoperated to do a variety of tasks, including picking up toys, removing laundry from a dryer, and playing games with kids.

As of right now, Stretch is very much a research platform. You’re going to see it in research labs doing research things, and hopefully in homes and commercial spaces as well, but still under the supervision of professional roboticists. As you may have guessed, though, Hello Robot’s vision is a bit broader than that. “The impact we want to have is through robots that are helpful to people in society,” Edsinger says. “We think primarily in the home context, but it could be in healthcare, or in other places. But we really want to have our robots be impactful, and useful. To us, useful is exciting.” Adds Kemp: “I have a personal bias, but we’d really like this technology to benefit older adults and caregivers. Rather than creating a specialized assistive device, we want to eventually create an inexpensive consumer device for everyone that does lots of things.”

Neither Edsinger nor Kemp would say much more on this for now, and they were very explicit about why—they’re being deliberately cautious about raising expectations, having seen what’s happened to some other robotics companies over the past few years. Without VC funding (Hello Robot is currently bootstrapping itself into existence), Stretch is being sold entirely on its own merits. So far, it seems to be working. Stretch robots are already in a half dozen research labs, and we expect that with today’s announcement, we’ll start seeing them much more frequently.

This article appears in the October 2020 print issue as “A Robot That Keeps It Simple.”

Posted in Human Robots

#437778 A Bug-Sized Camera for Bug-Sized Robots ...

As if it’s not hard enough to make very small mobile robots, once you’ve gotten the power and autonomy all figured out (good luck with that), your robot isn’t going to be all that useful unless it can carry some payload. And the payload that everybody wants robots to carry is a camera, which is of course a relatively big, heavy, power hungry payload. Great, just great.

This whole thing is frustrating because tiny, lightweight, power efficient vision systems are all around us. Literally, all around us right this second, stuffed into the heads of insects. We can’t make anything quite that brilliant (yet), but roboticists from the University of Washington, in Seattle, have gotten us a bit closer, with the smallest wireless, steerable video camera we’ve ever seen—small enough to fit on the back of a microbot, or even a live bug.

To make a camera this small, the UW researchers, led by Shyam Gollakota, a professor of computer science and engineering, had to start nearly from scratch, primarily because existing systems aren’t nearly so constrained by power availability. Even things like swallowable pill cameras require batteries that weigh more than a gram, but only power the camera for under half an hour. With a focus on small size and efficiency, they started with an off-the-shelf ultra low-power image sensor that’s 2.3 mm wide and weighs 6.7 mg. They stuck on a Bluetooth 5.0 chip (3 mm wide, 6.8 mg), and had a fun time connecting those two things together without any intermediary hardware to broadcast the camera output. A functional wireless camera also requires a lens (20 mg) and an antenna, which is just 5 mm of wire. An accelerometer is useful so that insect motion can be used to trigger the camera, minimizing the redundant frames that you’d get from a robot or an insect taking a nap.
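As a quick sanity check (ours, not from the paper), the quoted component masses can be tallied against the 84 mg system total; the antenna wire and interconnect are inferred as the remainder, and the 96 mg boost converter for the steering actuator is quoted separately:

```python
# Component masses as quoted in the article, in milligrams.
components_mg = {
    "image sensor": 6.7,
    "bluetooth 5.0 chip": 6.8,
    "lens": 20.0,
    "steerable head": 35.0,
}
accounted_mg = sum(components_mg.values())   # mass explicitly itemized
total_system_mg = 84.0                       # quoted total for the camera system
# Remainder presumably covers the 5 mm antenna wire, accelerometer,
# and interconnect (our inference, not stated in the article).
remainder_mg = total_system_mg - accounted_mg
```

The itemized parts account for 68.5 mg, leaving roughly 15.5 mg for everything else, which is plausible for a short wire antenna plus a tiny accelerometer and flex circuitry.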

Photo: University of Washington

The microcamera developed by the UW researchers can stream monochrome video at up to 5 frames per second to a cellphone 120 meters away.

The last bit to make up this system is a mechanically steerable “head,” weighing 35 mg and bringing the total weight of the wireless camera system to 84 mg. If the look of the little piezoelectric actuator seems familiar, you have very good eyes because it’s tiny, and also, it’s the same kind of piezoelectric actuator that the folks at UW use to power their itty bitty flying robots. It’s got a 60-degree panning range, but also requires a 96 mg boost converter to function, which is a huge investment in size and weight just to be able to point the camera a little bit. But overall, the researchers say that this pays off, because not having to turn the entire robot (or insect) when you want to look around reduces the energy consumption of the system as a whole by a factor of up to 84 (!).

Photo: University of Washington

Insects are very mobile platforms for outdoor use, but they’re also not easy to steer, so the researchers also built a little insect-scale robot that they could remotely control while watching the camera feed. As it turns out, this seems to be the smallest, power-autonomous terrestrial robot with a camera ever made.

This efficiency means that the wireless camera system can stream video frames (160×120 pixels monochrome) to a cell phone up to 120 meters away for up to 6 hours when powered by a 0.5-g, 10-mAh battery. A live, first-bug view can be streamed at up to 5 frames per second. The system was successfully tested on a pair of darkling beetles that were allowed to roam freely outdoors, and the researchers noted that they could also mount it on spiders or moths, or anything else that could handle the payload. (The researchers removed the electronics from the insects after the experiments and observed no noticeable adverse effects on their behavior.)
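To put those battery numbers in perspective, a back-of-the-envelope calculation (ours, ignoring boost-converter losses and capacity derating) shows just how little average current the whole streaming system draws:

```python
def implied_average_current_ma(capacity_mah, runtime_h):
    """Average current a battery of the given capacity can supply for the
    stated runtime, ignoring converter losses and capacity derating."""
    return capacity_mah / runtime_h

# The quoted 0.5 g, 10 mAh battery streaming video for 6 hours:
avg_ma = implied_average_current_ma(10.0, 6.0)
```

That works out to well under 2 mA on average for the sensor, radio, and actuator combined, which is what makes the swallowable-pill-camera comparison (battery over a gram, under half an hour) so stark.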

The researchers are already thinking about what it might take to put a wireless camera system on something that flies, and it’s not going to be easy—a bumblebee can only carry between 100 and 200 mg. The power system is the primary limitation here, but it might be possible to use a solar cell to cut down on battery requirements. And the camera itself could be scaled down as well, by using a completely custom sensor and a different type of lens. The other thing to consider is that with a long-range wireless link and a vision system, it’s possible to add sophisticated vision-based autonomy to tiny robots by doing the computation remotely. So, next time you see something scuttling across the ground, give it another look, because it might be looking right back at you.

“Wireless steerable vision for live insects and insect-scale robots,” by Vikram Iyer, Ali Najafi, Johannes James, Sawyer Fuller, and Shyamnath Gollakota from the University of Washington, is published in Science Robotics.

Posted in Human Robots

#437776 Video Friday: This Terrifying Robot Will ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.

The Aigency, which created the FitBot launch video below, is “the world’s first talent management resource for robotic personalities.”

Robots will be playing a bigger role in our lives in the future. By learning to speak their language and work with them now, we can make this future better for everybody. If you’re a creator that’s producing content to entertain and educate people, robots can be a part of that. And we can help you. Robotic actors can show up alongside the rest of your actors.

The folks at Aigency have put together a compilation reel of clips they’ve put on TikTok, which is nice of them, because some of us don’t know how to TikTok because we’re old and boring.

Do googly eyes violate the terms and conditions?

[ Aigency ]

Shane Wighton of the “Stuff Made Here” YouTube channel, who you might remember from that robotic basketball hoop, has a new invention: a haircut robot. This is not the first barber bot, but previous designs typically used hair clippers. Shane wanted his robot to use scissors. Hilarious and terrifying at once.

[ Stuff Made Here ]

Starting in October of 2016, Prof. Charlie Kemp and Henry M. Clever invented a new kind of robot. They named the prototype NewRo. In March of 2017, Prof. Kemp filmed this video of Henry operating NewRo to perform a number of assistive tasks. While visiting the Bay Area for an AAAI Symposium workshop at Stanford, Prof. Kemp showed this video to a select group of people to get advice, including Dr. Aaron Edsinger. In August of 2017, Dr. Edsinger and Dr. Kemp founded Hello Robot Inc. to commercialize this patent-pending assistive technology. Hello Robot Inc. licensed the intellectual property (IP) from Georgia Tech. After three years of stealthy effort, Hello Robot Inc. revealed Stretch, a new kind of robot!

[ Georgia Tech ]

NASA’s Ingenuity Mars Helicopter will make history's first attempt at powered flight on another planet next spring. It is riding with the agency's next mission to Mars (the Mars 2020 Perseverance rover) as it launches from Cape Canaveral Air Force Station later this summer. Perseverance, with Ingenuity attached to its belly, will land on Mars February 18, 2021.

[ JPL ]

For humans, it can be challenging to manipulate thin flexible objects like ropes, wires, or cables. But if these problems are hard for humans, they are nearly impossible for robots. As a cable slides between the fingers, its shape is constantly changing, and the robot’s fingers must be constantly sensing and adjusting the cable’s position and motion. A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and from the MIT Department of Mechanical Engineering pursued the task from a different angle, in a manner that more closely mimics us humans. The team’s new system uses a pair of soft robotic grippers with high-resolution tactile sensors (and no added mechanical constraints) to successfully manipulate freely moving cables.

The team observed that it was difficult to pull the cable back when it reached the edge of the finger, because of the convex surface of the GelSight sensor. Therefore, they hope to improve the finger-sensor shape to enhance the overall performance. In the future, they plan to study more complex cable manipulation tasks such as cable routing and cable inserting through obstacles, and they want to eventually explore autonomous cable manipulation tasks in the auto industry.

[ MIT ]

Gripping robots typically have trouble grasping transparent or shiny objects. A new technique from Carnegie Mellon University relies on a color camera system and machine learning to recognize shapes based on color.

[ CMU ]

A new robotic prosthetic leg prototype offers a more natural, comfortable gait while also being quieter and more energy efficient than other designs. The key is the use of new small and powerful motors with fewer gears, borrowed from the space industry. This streamlined technology enables a free-swinging knee and regenerative braking, which charges the battery during use with energy that would typically be dissipated when the foot hits the ground. This feature enables the leg to more than double a typical prosthetic user's walking needs with one charge per day.

[ University of Michigan ]

Thanks Kate!

This year’s Wonder League teams have been put to the test not only by the challenges set forth by Wonder Workshop and Cartoon Network (helping the creek kids from Craig of the Creek solve the greatest mystery of all, the quest for the Lost Realm) but also by forces outside their control. With a global pandemic separating teammates through lockdowns and quarantines, these teams continued to push themselves to find new ways to work together, solve problems, communicate more effectively, and complete a journey they started and refused to give up on. We at Wonder Workshop are humbled and in awe of all these teams have accomplished.

[ Wonder Workshop ]

Thanks Nicole!

Meet Colin Creager, a mechanical engineer at NASA's Glenn Research Center. Colin is focusing on developing tires that can be used on other worlds. These tires use coil springs made of a special shape memory alloy that will let rovers move across sharp jagged rocks or through soft sand on the Moon or Mars.

[ NASA ]

To be presented at IROS this year, “the first on-robot collision detection system using low-cost microphones.”

[ Rutgers ]

Robot and mechanism designs inspired by the art of origami have the potential to generate compact, deployable, lightweight morphing structures, as seen in nature, for potential applications in search-and-rescue, aerospace systems, and medical devices. However, it is challenging to obtain actuation that is easily patternable, reversible, and made with a scalable manufacturing process for origami-inspired self-folding machines. In this work, we describe an approach to designing reversible self-folding machines using a liquid crystal elastomer (LCE), which contracts when heated, as an artificial muscle.

[ UCSD ]

Just in case you need some extra home entertainment, and you’d like cleaner floors at the same time.

[ iRobot ]

Sure, toss it from a drone. Or from orbit. Whatever, it’s squishy!

[ Squishy Robotics ]

The [virtual] RSS conference this week featured an excellent lineup of speakers and panels, and the best part about it being virtual is that you can watch them all at your leisure! Here’s what’s been posted so far:

[ RSS 2020 ]

Lockheed Martin Robotics Seminar: Toward autonomous flying insect-sized robots: recent results in fabrication, design, power systems, control, and sensing with Sawyer Fuller.

[ UMD ]

In this episode of the AI Podcast, Lex interviews Sergey Levine.

[ AI Podcast ] Continue reading

Posted in Human Robots

#437765 Video Friday: Massive Robot Joins ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AWS Cloud Robotics Summit – August 18-19, 2020 – [Online Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Here are some professional circus artists messing around with an industrial robot for fun, like you do.

The acrobats are part of Östgötateatern, a Swedish theatre group, and the chair bit got turned into its own act, called “The Last Fish.” But apparently the Swedish Work Environment Authority didn’t like that an industrial robot—a large ABB robotic arm—was being used in an artistic performance, arguing that the same safety measures that apply in a factory setting would apply on stage. In other words, the robot had to operate inside a protective cage and humans could not physically interact with it.

When told that their robot had to be removed, the acrobats went to court. And won! At least that’s what we understand from this Swedish press release. The court in Linköping, in southern Sweden, ruled that the safety measures taken by the theater had been sufficient. The group had worked with a local robotics firm, Dyno Robotics, to program the manipulator and learn how to interact with it as safely as possible. The robot—which the acrobats say is the eighth member of their troupe—will now be allowed to return.

[ Östgötateatern ]

Houston Mechatronics’ Aquanaut continues to be awesome, even in the middle of a pandemic. It’s taken the big step (big swim?) out of NASA’s swimming pool and into open water.

[ HMI ]

Researchers from Carnegie Mellon University and Facebook AI Research have created a navigation system for robots powered by common sense. The technique uses machine learning to teach robots how to recognize objects and understand where they’re likely to be found in a house. The result allows the machines to search more strategically.

[ CMU ]

Cassie manages 2.1 m/s, which is uncomfortably fast in a couple of different ways.

Next, untethered. After that, running!

[ Michigan Robotics ]

Engineers at Caltech have designed a new data-driven method to control the movement of multiple robots through cluttered, unmapped spaces, so they do not run into one another.

Multi-robot motion coordination is a fundamental robotics problem with wide-ranging applications, from urban search and rescue to the control of fleets of self-driving cars to formation flying in cluttered environments. Two key challenges make multi-robot coordination difficult: first, robots moving in new environments must make split-second decisions about their trajectories despite having incomplete data about their future path; second, the presence of larger numbers of robots in an environment makes their interactions increasingly complex (and more prone to collisions).

To overcome these challenges, Soon-Jo Chung, Bren Professor of Aerospace, and Yisong Yue, professor of computing and mathematical sciences, along with Caltech graduate student Benjamin Rivière (MS ’18), postdoctoral scholar Wolfgang Hönig, and graduate student Guanya Shi, developed a multi-robot motion-planning algorithm called “Global-to-Local Safe Autonomy Synthesis,” or GLAS, which imitates a complete-information planner with only local information, and “Neural-Swarm,” a swarm-tracking controller augmented to learn complex aerodynamic interactions in close-proximity flight.

[ Caltech ]

Fetch Robotics’ Freight robot is now hauling around pulsed xenon UV lamps to autonomously disinfect spaces with UV-A, UV-B, and UV-C, all at the same time.

[ SmartGuard UV ]

When you’re a vertically symmetrical quadruped robot, there is no upside-down.

[ Ghost Robotics ]

In the virtual world, the objects you pick up do not exist: you can see a cup or a pen, but you cannot feel yourself touching them. That presented a challenge to EPFL professor Herbert Shea. Drawing on his extensive experience with silicone-based muscles and motors, Shea wanted to find a way to make virtual objects feel real. “With my team, we’ve created very small, thin and fast actuators,” explains Shea. “They are millimeter-sized capsules that use electrostatic energy to inflate and deflate.” The capsules have an outer insulating membrane made of silicone enclosing an inner pocket filled with oil. Each bubble is surrounded by four electrodes that can close like a zipper. When a voltage is applied, the electrodes are pulled together, causing the center of the capsule to swell like a blister. It is an ingenious system because the capsules, known as HAXELs, can move not only up and down, but also side to side and around in a circle. “When they are placed under your fingers, it feels as though you are touching a range of different objects,” says Shea.

[ EPFL ]

Through the simple trick of reversing motors on impact, a quadrotor can land much more reliably on slopes.

[ Sherbrooke ]

Turtlebot delivers candy at Harvard.

I <3 Turtlebot SO MUCH

[ Harvard ]

Traditional drone controllers are a little counterintuitive: one stick controls forward and backward motion and another controls up and down, yet both sticks physically move along the same axis. How does that make sense?! Here’s a remote that gives you actual z-axis control instead.

[ Fenics ]

Thanks Ashley!

Lio is a mobile robot platform with a multifunctional arm explicitly designed for human-robot interaction and personal care assistant tasks. The robot has already been deployed in several health care facilities, where it is functioning autonomously, assisting staff and patients on an everyday basis.

[ F&P Robotics ]

The video shows a ground vehicle autonomously exploring and mapping a multi-story parking garage and a connected patio on the Carnegie Mellon University campus. The vehicle runs onboard state estimation and mapping leveraging range, vision, and inertial sensing, local planning for collision avoidance, and terrain analysis. All processing is real-time, with no post-processing involved. The vehicle drives at 2 m/s throughout the exploration run. This work was developed for the DARPA Subterranean Challenge.

[ CMU ]

Raytheon UK’s flagship STEM programme, the Quadcopter Challenge, gives 14- to 15-year-olds the chance to participate in a hands-on, STEM-based engineering challenge to build a fully operational quadcopter. Each team is provided with an identical kit of parts, tools and instructions to build and customise their quadcopter, whilst Raytheon UK STEM Ambassadors provide mentoring, technical support and deliver bite-size learning modules to support the build.

[ Raytheon ]

A video on some of the research work that is being carried out at The Australian Centre for Field Robotics, University of Sydney.

[ University of Sydney ]

Jeannette Bohg, assistant professor of computer science at Stanford University, gave one of the Early Career Award Keynotes at RSS 2020.

[ RSS 2020 ]

Adam Savage remembers Grant Imahara.

[ Tested ] Continue reading

Posted in Human Robots