Tag Archives: Robotic System

#435752 T-RHex Is a Hexapod Robot With ...

In Aaron Johnson’s “Robot Design & Experimentation” class at CMU, teams of students have a semester to design and build an experimental robotic system based on a theme. For spring 2019, that theme was “Bioinspired Robotics,” which is definitely one of our favorite kinds of robotics—animals can do all kinds of crazy things, and it’s always a lot of fun watching robots try to match them. They almost never succeed, of course, but even basic imitation can lead to robots with some unique capabilities.

One of the projects from this year’s course, from Team ScienceParrot, is a new version of RHex called T-RHex (pronounced T-Rex, like the dinosaur). T-RHex comes with a tail, but more importantly, it has tiny tapered toes, which help it grip onto rough surfaces like bricks, wood, and concrete. It’s able to climb its way up very steep slopes, and hang from them, relying on its toes to keep itself from falling off.

T-RHex’s toes are called microspines, and we’ve seen them in all kinds of robots. The most famous of these is probably JPL’s LEMUR IIB (which wins on sheer microspine volume), although the concept goes back at least 15 years to Stanford’s SpinyBot. Robots that use microspines to climb tend to be fairly methodical at it, since the microspines have to be engaged and disengaged with care, limiting their non-climbing mobility.

T-RHex manages to perform many of the same sorts of climbing and hanging maneuvers without losing RHex’s ability for quick, efficient wheel-leg (wheg) locomotion.

If you look closely at T-RHex walking in the video, you’ll notice that in its normal forward gait, it’s sort of walking on its ankles, rather than its toes. This means that the microspines aren’t engaged most of the time, so that the robot can use its regular wheg motion to get around. To engage the microspines, the robot moves its whegs backwards, meaning that its tail is arguably coming out of its head. But since all of T-RHex’s capability is mechanical in nature and it has no active sensors, it doesn’t really need a head, so that’s fine.

The highest climbable slope that T-RHex could manage was 55 degrees, meaning that it can’t yet conquer vertical walls. The researchers were most surprised by the robot’s ability to cling to surfaces, where it was perfectly happy to hang out on a slope of 135 degrees, which is a 45-degree overhang (!). I have no idea how it would ever reach that kind of position on its own, but it’s nice to know that if it ever does, its spines will keep doing their job.

Photo: CMU

T-RHex uses laser-cut acrylic legs, with the microspines embedded into 3D-printed toes. The tail is needed to prevent the robot from tipping backward.

For more details about the project, we spoke with Team ScienceParrot member (and CMU PhD student) Catherine Pavlov via email.

IEEE Spectrum: We’re used to seeing RHex with compliant, springy legs—how do the new legs affect T-RHex’s mobility?

Catherine Pavlov: There’s some compliance in the legs, though not as much as RHex—this is driven by the use of acrylic, which was chosen for budget/manufacturing reasons. Matching the compliance of RHex with acrylic would have made the tines too weak (since often only a few hold the load of the robot during climbing). It definitely means you can’t use energy storage in the legs the way RHex does, for example when pronking. T-RHex is probably more limited by motor speed in terms of mobility though. We were using some borrowed Dynamixels that didn’t allow for good positioning at high speeds.

How did you design the climbing gait? Why not use the middle legs, and why is the tail necessary?

The gait was a lot of hand-tuning and trial-and-error. We wanted a left/right symmetric gait to enable load sharing among more spines and prevent out-of-plane twisting of the legs. When using all three pairs, you have to have very accurate angular positioning or one leg pair gets pushed off the wall. Since two legs should be able to hold the full weight of the robot, using the middle legs was hurting more than it was helping, with the middle legs sometimes pushing the rear ones off of the wall.

The tail is needed to prevent the robot from tipping backward and “sitting” on the wall. During static testing we saw the robot tip backward, disengaging the front legs, at around 35 degrees incline. The tail allows us to load the front legs, even when they’re at a shallow angle to the surface. The climbing gait we designed uses the tail to allow the rear legs to fully recirculate without the robot tipping backward.
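
A back-of-the-envelope statics sketch helps explain that tipping threshold (the symbols and geometry below are assumptions for illustration, not the team’s analysis): the robot tips backward about its rearmost contact once the gravity moment about that point exceeds the restoring moment.

```latex
% Moment balance about the rearmost contact on a slope of angle \theta
% (d = along-slope distance from that contact to the CoM,
%  h = height of the CoM above the slope surface):
m\,g\,d\cos\theta \;\ge\; m\,g\,h\sin\theta
\quad\Longleftrightarrow\quad
\tan\theta \le \frac{d}{h}
```

With tan 35° roughly 0.7, the observed tipping angle suggests d/h of about 0.7 without the tail; a tail contact behind the rear legs moves the pivot down-slope, increasing d and raising the angle at which the front legs disengage.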

Photo: CMU

Team ScienceParrot with T-RHex.

What prevents T-RHex from climbing even steeper surfaces?

There are a few limiting factors. One is that the tines of the legs break pretty easily. I think we also need a lighter platform to get fully vertical—we’re going to look at MiniRHex for future work. We’re also not convinced our gait is the best it can be; we can probably get marginal improvements with more tuning, which might be enough.

Can the microspines assist with more dynamic maneuvers?

Dynamic climbing maneuvers? I think that would only be possible on surfaces with very good surface adhesion and very good surface strength, but it’s certainly theoretically possible. The current instance of T-RHex would definitely break if you tried to wall jump though.

What are you working on next?

Our main target is exploring the space of materials for leg fabrication, such as fiberglass, PLA, urethanes, and maybe metallic glass. We think there’s a lot of room for improvement in the leg material and geometry. We’d also like to see MiniRHex equipped with microspines, which will require legs about half the scale of what we built for T-RHex. Longer-term improvements would be the addition of sensors (e.g., for wall detection), a reliable floor-to-wall transition, and dynamic gait transitions.

[ T-RHex ]

#435683 How High Fives Help Us Get in Touch With ...

The human sense of touch is so naturally ingrained in our everyday lives that we often don’t notice its presence. Even so, touch is a crucial sensing ability that helps people to understand the world and connect with others. As the market for robots grows, and as robots become more ingrained into our environments, people will expect robots to participate in a wide variety of social touch interactions. At Oregon State University’s Collaborative Robotics and Intelligent Systems (CoRIS) Institute, I research how to equip everyday robots with better social-physical interaction skills—from playful high-fives to challenging physical therapy routines.

Some commercial robots already possess certain physical interaction skills. For example, the videoconferencing feature of mobile telepresence robots can keep far-away family members connected with one another. These robots can also roam distant spaces and bump into people, chairs, and other remote objects. And my Roomba occasionally tickles my toes before turning to vacuum a different area of the room. As a human being, I naturally interpret this (and other Roomba behaviors) as social, even if they were not intended as such. At the same time, for both of these systems, social perceptions of the robots’ physical interaction behaviors are not well understood, and these social touch-like interactions cannot be controlled in nuanced ways.

Before joining CoRIS early this year, I was a postdoc at the University of Southern California’s Interaction Lab, and prior to that, I completed my doctoral work at the GRASP Laboratory’s Haptics Group at the University of Pennsylvania. My dissertation focused on improving the general understanding of how robot control and planning strategies influence perceptions of social touch interactions. As part of that research, I conducted a study of human-robot hand-to-hand contact, focusing on an interaction somewhere between a high five and a hand-clapping game. I decided to study this particular interaction because people often high five, and they will likely expect robots in everyday spaces to high five as well!

The implications of motion and planning for the social touch experience in these interactions are also crucial—think about a disappointingly wimpy (or triumphantly amazing) high five that you’ve experienced in the past. This great or terrible high-fiving experience could be fleeting, but it could also influence who you interact with, who you’re friends with, and even how you perceive the character or personalities of those around you. This type of perception, judgement, and response could extend to personal robots, too!

An investigation like this requires a mixture of more traditional robotics research (e.g., understanding how to move and control a robot arm, developing models of the desired robot motion) along with techniques from design and psychology (e.g., performing interviews with research participants, using best practices from experimental methods in perception). Enabling robots with social touch abilities also comes with many challenges, and even skilled humans can have trouble anticipating what another person is about to do. Think about trying to make satisfying hand contact during a high five—you might know the classic adage “watch the elbow,” but if you’re like me, even this may not always work.

I conducted a research study involving eight different types of human-robot hand contact, with different combinations of the following: interactions with a facially reactive or non-reactive robot, a physically reactive or non-reactive planning strategy, and a lower or higher robot arm stiffness. My robotic system could become facially reactive by changing its facial expression in response to hand contact, or physically reactive by updating its plan of where to move next after sensing hand contact. The stiffness of the robot could be adjusted by changing a variable that controlled how quickly the robot’s motors tried to pull its arm to the desired position. I knew from previous research that fine differences in touch interactions can have a big impact on perceived robot character. For example, if a robot grips an object too tightly or for too long while handing an object to a person, it might be perceived as greedy, possessive, or perhaps even Sméagol-like. A robot that lets go too soon might appear careless or sloppy.
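
To make “a variable that controlled how quickly the robot’s motors tried to pull its arm to the desired position” a bit more concrete, here is a minimal sketch of joint-space stiffness (PD) control. It is an illustration only, not the code used in the study; the gains, dimensions, and values are hypothetical.

```python
import numpy as np

def joint_torques(q, q_dot, q_desired, kp, kd):
    """Joint-space PD (impedance-style) control.

    q, q_dot   : current joint positions and velocities (rad, rad/s)
    q_desired  : commanded joint positions (rad)
    kp         : stiffness gain -- the single number that makes the arm
                 feel stiff (large kp) or compliant (small kp)
    kd         : damping gain
    """
    return kp * (q_desired - q) - kd * q_dot

# Hypothetical 7-joint arm state
q         = np.zeros(7)
q_dot     = np.zeros(7)
q_desired = np.full(7, 0.3)

soft_tau  = joint_torques(q, q_dot, q_desired, kp=20.0,  kd=2.0)   # compliant arm
stiff_tau = joint_torques(q, q_dot, q_desired, kp=200.0, kd=20.0)  # stiff arm
print(soft_tau, stiff_tau)
```

Raising the stiffness gain makes the arm track its pre-programmed trajectory more precisely, while lowering it lets contact with a human hand push the arm off its path.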

In the example cases of robot grip, it’s clear that understanding people’s perceptions of robot characteristics and personality can help roboticists choose the right robot design based on the proposed operating environment of the robot. I likewise wanted to learn how the facial expressions, physical reactions, and stiffness of a hand-clapping robot would influence human perceptions of robot pleasantness, energeticness, dominance, and safety. Understanding this relationship can help roboticists to equip robots with personalities appropriate for the task at hand. For example, a robot assisting people in a grocery store may need to be designed with a high level of pleasantness and only moderate energy, while a maximally effective robot for comedy roast battles may need high degrees of energy and dominance above all else.

After many a late night at the GRASP Lab clapping hands with a big red robot, I was ready to conduct the study. Twenty participants visited the lab to clap hands with our Baxter Research Robot and help me begin to understand how characteristics of this humanoid robot’s social touch influenced its pleasantness, energeticness, dominance, and apparent safety. Baxter interacted with participants using a custom 3D-printed hand that was inlaid with silicone inserts.

The study showed that a facially reactive robot seemed more pleasant and energetic. A physically reactive robot seemed less pleasant, energetic, and dominant for this particular study design and interaction. I thought contact with a stiffer robot would seem harder (and therefore more dominant and less safe), but counter to my expectations, a stiffer-armed robot seemed safer and less dominant to participants. This may be because the stiffer robot was more precise in following its pre-programmed trajectory, therefore seeming more predictable and less free-spirited.

Safety ratings of the robot were generally high, and several participants commented positively on the robot’s facial expressions. Some participants attributed inventive (and non-existent) intelligences to the robot—I used neither computer vision nor the Baxter robot’s cameras in this study, but more than one participant complimented me on how well the robot tracked their hand position. While interacting with the robot, participants displayed happy facial expressions more than any other analyzed type of expression.

Photo: Naomi Fitter

Participants were asked to clap hands with Baxter and describe how they perceived the robot in terms of its pleasantness, energeticness, dominance, and apparent safety.

Circling back to the idea of how people might interpret even rudimentary and practical robot behaviors as social, these results show that this type of social perception isn’t just true for my lovable (but sometimes dopey) Roomba, but also for collaborative industrial robots, and generally, any robot capable of physical human-robot interaction. In designing the motion of Baxter, the adjustment of a single number in the equation that controls joint stiffness can flip the robot from seeming safe and docile to brash and commanding. These implications are sometimes predictable, but often unexpected.

The results of this particular study give us a partial guide to manipulating the emotional experience of robot users by adjusting aspects of robot control and planning, but future work is needed to fully understand the design space of social touch. Will materials play a major role? How about personalized machine learning? Do results generalize over all robot arms, or even a specialized subset like collaborative industrial robot arms? I’m planning to continue answering these questions, and when I finally solve human-robot social touch, I’ll high five all my robots to celebrate.

Naomi Fitter is an assistant professor in the Collaborative Robotics and Intelligent Systems (CoRIS) Institute at Oregon State University, where her Social Haptics, Assistive Robotics, and Embodiment (SHARE) research group aims to equip robots with the ability to engage and empower people in interactions from playful high-fives to challenging physical therapy routines. She completed her doctoral work in the GRASP Laboratory’s Haptics Group and was a postdoctoral scholar in the University of Southern California’s Interaction Lab from 2017 to 2018. Naomi’s not-so-secret pastime is performing stand-up and improv comedy.

#435681 Video Friday: This NASA Robot Uses ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2019 – July 29-30, 2019 – London, U.K.
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Let us know if you have suggestions for next week, and enjoy today’s videos.

Robots can land on the Moon and drive on Mars, but what about the places they can’t reach? Designed by engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California, a four-limbed robot named LEMUR (Limbed Excursion Mechanical Utility Robot) can scale rock walls, gripping with hundreds of tiny fishhooks in each of its 16 fingers and using artificial intelligence to find its way around obstacles. In its last field test in Death Valley, California, in early 2019, LEMUR chose a route up a cliff, scanning the rock for ancient fossils from the sea that once filled the area.

The LEMUR project has since concluded, but it helped lead to a new generation of walking, climbing and crawling robots. In future missions to Mars or icy moons, robots with AI and climbing technology derived from LEMUR could discover similar signs of life. Those robots are being developed now, honing technology that may one day be part of future missions to distant worlds.

[ NASA ]

This video demonstrates the autonomous footstep planning developed by IHMC. Robots in this video are the Atlas humanoid robot (DRC version) and the NASA Valkyrie. The operator specifies a goal location in the world, which is modeled as planar regions using the robot’s perception sensors. The planner then automatically computes the necessary steps to reach the goal using a Weighted A* algorithm. The algorithm does not reject footholds that have only partial support; instead, it modifies them after the plan is found to try to increase that support area.

Currently, narrow terrain has a success rate of about 50%, rough terrain is about 90%, whereas flat ground is near 100%. We plan on increasing planner speed and the ability to plan through mazes and to unseen goals by including a body-path planner as the first step. Control, Perception, and Planning algorithms by IHMC Robotics.
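
As a rough illustration of the “Weighted A*” idea in that description, here is a minimal grid-based sketch. Real footstep planning searches over footholds on planar regions with reachability and support checks; this toy version only shows the weighted-heuristic part (f = g + w·h, with w > 1 trading strict optimality for speed). The grid, costs, and weight are assumptions, not IHMC’s code.

```python
import heapq

def weighted_astar(start, goal, blocked, w=2.0):
    """Weighted A* on a 4-connected grid: f = g + w * h.
    A weight w > 1 biases the search toward the goal, finding plans faster
    at the cost of strict optimality."""
    def h(n):  # Manhattan-distance heuristic
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    open_set = [(w * h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in blocked:
                continue
            ng = g + 1
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(open_set, (ng + w * h(nxt), ng, nxt, path + [nxt]))
    return None  # no plan found

# Hypothetical example: plan around a small wall of blocked cells
blocked = {(2, 0), (2, 1), (2, 2)}
print(weighted_astar((0, 0), (4, 0), blocked))
```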

[ IHMC ]

I’ve never really been able to get into watching people play poker, but throw an AI from CMU and Facebook into a game of no-limit Texas hold’em with five humans, and I’m there.

[ Facebook ]

In this video, Cassie Blue is navigating autonomously. Right now, her world is very small, the Wavefield at the University of Michigan, where she is told to turn left at intersections. You’re right, that is not a lot of independence, but it’s a first step away from a human and an RC controller!

Using a RealSense RGBD Camera, an IMU, and our version of an InEKF with contact factors, Cassie Blue is building a 3D semantic map in real time that identifies sidewalks, grass, poles, bicycles, and buildings. From the semantic map, occupancy and cost maps are built with the sidewalk identified as walk-able area and everything else considered as an obstacle. A planner then sets a goal to stay approximately 50 cm to the right of the sidewalk’s left edge and plans a path around obstacles and corners using D*. The path is translated into way-points that are achieved via Cassie Blue’s gait controller.
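
To make that pipeline a bit more concrete, here is a minimal sketch of the “semantic map → cost map” step: each cell’s semantic label is mapped to a traversal cost, with sidewalk cheap and everything else treated as an obstacle, and the resulting grid is what a planner such as D* would then search. The labels, costs, and grid layout are assumptions for illustration, not the Michigan team’s code.

```python
import numpy as np

# Hypothetical semantic labels for each grid cell
SIDEWALK, GRASS, POLE, BUILDING = 0, 1, 2, 3

# Cost per label: sidewalk is walkable, everything else is (near-)forbidden
LABEL_COST = {
    SIDEWALK: 1.0,
    GRASS:    50.0,
    POLE:     np.inf,
    BUILDING: np.inf,
}

def semantic_to_cost_map(semantic_map):
    """Convert an HxW grid of semantic labels into a traversal-cost grid."""
    cost = np.empty(semantic_map.shape, dtype=float)
    for label, c in LABEL_COST.items():
        cost[semantic_map == label] = c
    return cost

# Tiny example: a sidewalk strip through grass, with one pole on the sidewalk
semantic = np.full((5, 8), GRASS)
semantic[2, :] = SIDEWALK
semantic[2, 4] = POLE
print(semantic_to_cost_map(semantic))
```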

[ University of Michigan ]

Thanks Jesse!

Dave from HEBI Robotics wrote in to share some new actuators that are designed to get all kinds of dirty: “The R-Series takes HEBI’s X-Series to the next level, providing a sealed robotics solution for rugged, industrial applications and laying the groundwork for industrial users to address challenges that are not well met by traditional robotics. To prove it, we shot some video right in the Allegheny River here in Pittsburgh. Not a bad way to spend an afternoon :-)”

The R-Series Actuator is a full-featured robotic component as opposed to a simple servo motor. The output rotates continuously, requires no calibration or homing on boot-up, and contains a thru-bore for easy daisy-chaining of wiring. Modular in nature, R-Series Actuators can be used in everything from wheeled robots to collaborative robotic arms. They are sealed to IP67 and designed with a lightweight form factor for challenging field applications, and they’re packed with sensors that enable simultaneous control of position, velocity, and torque.

[ HEBI Robotics ]

Thanks Dave!

If your robot hands out karate chops on purpose, that’s great. If it hands out karate chops accidentally, maybe you should fix that.

COVR is short for “being safe around collaborative and versatile robots in shared spaces”. Our mission is to significantly reduce the complexity of safety-certifying cobots. Increasing safety for collaborative robots enables new innovative applications, thus increasing production and job creation for companies utilizing the technology. Whether you’re an established company seeking to deploy cobots or an innovative startup with a prototype of a cobot-related product, COVR will help you analyze, test and validate the safety for that application.

[ COVR ]

Thanks Anna!

EPFL startup Flybotix has developed a novel drone with just two propellers and an advanced stabilization system that allow it to fly for twice as long as conventional models. That fact, together with its small size, makes it perfect for inspecting hard-to-reach parts of industrial facilities such as ducts.

[ Flybotix ]

SpaceBok is a quadruped robot designed and built by a Swiss student team from ETH Zurich and ZHAW Zurich, currently being tested using Automation and Robotics Laboratories (ARL) facilities at our technical centre in the Netherlands. The robot is being used to investigate the potential of ‘dynamic walking’ and jumping to get around in low gravity environments.

SpaceBok could potentially go up to 2 m high in lunar gravity, although such a height poses new challenges. Once it comes off the ground, the legged robot needs to stabilise itself to come down again safely – like a mini-spacecraft. So, like a spacecraft, SpaceBok uses a reaction wheel to control its orientation.

[ ESA ]

A new video from GITAI showing progress on their immersive telepresence robot for space.

[ GITAI ]

Tech United’s HERO robot (a Toyota HSR) competed in the RoboCup@Home competition, and it had a couple of garbage-related hiccups.

[ Tech United ]

Even small drones are getting better at autonomous obstacle avoidance in cluttered environments at useful speeds, as this work from the HKUST Aerial Robotics Group shows.

[ HKUST ]

DelFly Nimbles now come in swarms.

[ DelFly Nimble ]

This is a very short video, but it’s a fairly impressive look at a Baxter robot collaboratively helping someone put a shirt on, a useful task for folks with disabilities.

[ Shibata Lab ]

ANYmal can inspect the concrete in sewers for deterioration by sliding its feet along the ground.

[ ETH Zurich ]

HUG is a haptic user interface for teleoperating advanced robotic systems such as the humanoid robot Justin or the assistive robotic system EDAN. With its lightweight robot arms, HUG can measure human movements and simultaneously display forces from the distant environment. In addition to such teleoperation applications, HUG serves as a research platform for virtual assembly simulations, rehabilitation, and training.

[ DLR ]

This video about “image understanding” from CMU in 1979 (!) is amazing, and even though it’s long, you won’t regret watching until 3:30. Or maybe you will.

[ ARGOS (pdf) ]

Will Burrard-Lucas’ BeetleCam turned 10 this month, and in this video, he recounts the history of his little robotic camera.

[ BeetleCam ]

In this week’s episode of Robots in Depth, Per speaks with Gabriel Skantze from Furhat Robotics.

Gabriel Skantze is co-founder and Chief Scientist at Furhat Robotics and Professor in speech technology at KTH with a specialization in conversational systems. He has a background in research into how humans use spoken communication to interact.

In this interview, Gabriel talks about how the social robot revolution makes it necessary to communicate with humans in a human way, through speech and facial expressions. This is necessary as we expand the number of people that interact with robots as well as the types of interaction. Gabriel gives us more insight into the many challenges of implementing spoken communication for co-bots, where robots and humans work closely together. They need to communicate about the world, the objects in it, and how to handle them. We also get to hear how having an embodied system using the Furhat robot head helps the interaction between humans and the system.

[ Robots in Depth ]

#435662 Video Friday: This 3D-Printed ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRES 2019 – July 29-30, 2019 – London, U.K.
DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
Let us know if you have suggestions for next week, and enjoy today’s videos.

We’re used to seeing bristle bots about the size of a toothbrush head (which is not a coincidence), but Georgia Tech has downsized them, with some interesting benefits.

Researchers have created a new type of tiny 3D-printed robot that moves by harnessing vibration from piezoelectric actuators, ultrasound sources or even tiny speakers. Swarms of these “micro-bristle-bots” might work together to sense environmental changes, move materials – or perhaps one day repair injuries inside the human body.

The prototype robots respond to different vibration frequencies depending on their configurations, allowing researchers to control individual bots by adjusting the vibration. Approximately two millimeters long – about the size of the world’s smallest ant – the bots can cover four times their own length in a second despite the physical limitations of their small size.
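
A hedged sketch of the addressing idea in that paragraph: if each micro-bristle-bot only responds within its own resonant frequency band, you can “select” a bot simply by choosing the drive frequency of the shared vibration source. The bands and bot names below are made up for illustration, not values from the Georgia Tech work.

```python
# Hypothetical resonant response bands (Hz) for three micro-bristle-bots
RESPONSE_BANDS = {
    "bot_a": (8_000, 10_000),
    "bot_b": (12_000, 14_000),
    "bot_c": (16_000, 18_000),
}

def responding_bots(drive_freq_hz):
    """Return which bots a given drive frequency would actuate."""
    return [name for name, (lo, hi) in RESPONSE_BANDS.items()
            if lo <= drive_freq_hz <= hi]

def frequency_for(bot_name):
    """Pick a drive frequency in the middle of one bot's band."""
    lo, hi = RESPONSE_BANDS[bot_name]
    return (lo + hi) / 2

print(responding_bots(frequency_for("bot_b")))  # -> ['bot_b']
```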

“We are working to make the technology robust, and we have a lot of potential applications in mind,” said Azadeh Ansari, an assistant professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. “We are working at the intersection of mechanics, electronics, biology and physics. It’s a very rich area and there’s a lot of room for multidisciplinary concepts.”

[ Georgia Tech ]

Most consumer drones are “multi-copters,” meaning that they have a series of rotors or propellers that allow them to hover like helicopters. But having rotors severely limits their energy efficiency, which means that they can’t easily carry heavy payloads or fly for long periods of time. To get the best of both worlds, drone designers have tried to develop “hybrid” fixed-wing drones that can fly as efficiently as airplanes, while still taking off and landing vertically like multi-copters.

These drones are extremely hard to control because of the complexity of dealing with their flight dynamics, but a team from MIT CSAIL aims to make the customization process easier, with a new system that allows users to design drones of different sizes and shapes that can nimbly switch between hovering and gliding – all by using a single controller.

In future work, the team plans to try to further increase the drone’s maneuverability by improving its design. The model doesn’t yet fully take into account complex aerodynamic effects between the propeller’s airflow and the wings. And lastly, their method trained the copter with “yaw velocity” set at zero, which means that it cannot currently perform sharp turns.

[ Paper ] via [ MIT ]

We’re not quite at the point where we can 3D print entire robots, but UCSD is getting us closer.

The UC San Diego researchers’ insight was twofold. They turned to a commercially available printer for the job (the Stratasys Objet350 Connex3—a workhorse in many robotics labs). In addition, they realized one of the materials used by the 3D printer is made of carbon particles that can conduct power to sensors when connected to a power source. So the roboticists used the black resin to manufacture complex sensors embedded within robotic parts made of clear polymer. They designed and manufactured several prototypes, including a gripper.

When stretched, the sensors failed at approximately the same strain as human skin. But the polymers the 3D printer uses are not designed to conduct electricity, so their performance is not optimal. The 3D printed robots also require a lot of post-processing before they can be functional, including careful washing to clean up impurities and drying.

However, researchers remain optimistic that in the future, materials will improve and make 3D printed robots equipped with embedded sensors much easier to manufacture.

[ UCSD ]

Congrats to Team Homer from the University of Koblenz-Landau, who won the RoboCup@Home world championship in Sydney!

[ Team Homer ]

When you’ve got a robot with both wheels and legs, motion planning is complicated. IIT has developed a new planner for CENTAURO that takes advantage of the different ways that the robot is able to get past obstacles.

[ Centauro ]

Thanks Dimitrios!

If you constrain a problem tightly enough, you can solve it even with a relatively simple robot. Here’s an example of an experimental breakfast robot named “Loraine” that can cook eggs, bacon, and potatoes using what looks to be zero sensing at all, just moving to different positions and actuating its gripper.

There’s likely to be enough human work required in the prep here to make the value that the robot adds questionable at best, but it’s a good example of how you can make a relatively complex task robot-compatible as long as you set it up in just the right way.

[ Connected Robotics ] via [ RobotStart ]

It’s been a while since we’ve seen a ball bot, and I’m not sure that I’ve ever seen one with a manipulator on it.

[ ETH Zurich RSL ]

Soft Robotics’ new mini fingers are able to pick up taco shells without shattering them, which as far as I can tell is 100 percent impossible for humans to do.

[ Soft Robotics ]

Yes, Starship’s wheeled robots can climb curbs, and indeed they have a pretty neat way of doing it.

[ Starship ]

Last year we posted a long interview with Christoph Bartneck about his research into robots and racism, and here’s a nice video summary of the work.

[ Christoph Bartneck ]

Canada’s contribution to the Lunar Gateway will be a smart robotic system which includes a next-generation robotic arm known as Canadarm3, as well as equipment and specialized tools. Using cutting-edge software and advances in artificial intelligence, this highly-autonomous system will be able to maintain, repair and inspect the Gateway, capture visiting vehicles, relocate Gateway modules, help astronauts during spacewalks, and enable science both in lunar orbit and on the surface of the Moon.

[ CSA ]

An interesting demo of how Misty can integrate sound localization with other services.

[ Misty Robotics ]

The third and final period of the H2020 AEROARMS project has brought the final developments in industrial inspection and maintenance tasks, such as crawler retrieval and deployment (DLR) and industrial validation in settings like a refinery or a cement factory.

[ Aeroarms ]

The Guardian S remote visual inspection and surveillance robot navigates a disaster training site to demonstrate its advanced maneuverability, long-range wireless communications and extended run times.

[ Sarcos ]

This appears to be a cake frosting robot and I wish I had like 3 more hours of this to share:

Also here is a robot that picks fried chicken using a curiously successful technique:

[ Kazumichi Moriyama ]

This isn’t strictly robots, but Professor Hiroshi Ishii, associate director of the MIT Media Lab, gave a fascinating SIGCHI Lifetime Achievement Talk that’s absolutely worth your time.

[ Tangible Media Group ]

#435646 Video Friday: Kiki Is a New Social Robot ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

DARPA SubT Tunnel Circuit – August 15-22, 2019 – Pittsburgh, Pa., USA
IEEE Africon 2019 – September 25-27, 2019 – Accra, Ghana
ISRR 2019 – October 6-10, 2019 – Hanoi, Vietnam
Ro-Man 2019 – October 14-18, 2019 – New Delhi, India
Humanoids 2019 – October 15-17, 2019 – Toronto, Canada
ARSO 2019 – October 31-November 1, 2019 – Beijing, China
ROSCon 2019 – October 31-November 1, 2019 – Macau
Let us know if you have suggestions for next week, and enjoy today’s videos.

The DARPA Subterranean Challenge tunnel circuit takes place in just a few weeks, and we’ll be there!

[ DARPA SubT ]

In this time-lapse video, the robotic arm on NASA’s Mars 2020 rover handily maneuvers 88 pounds (40 kilograms) of sensor-laden turret as it moves from a deployed to a stowed configuration.

If you haven’t read our interview with Matt Robinson, now would be a great time, since he’s one of the folks at JPL who designed this arm.

[ Mars 2020 ]

Kiki is a small, white, stationary social robot with an evolving personality who promises to be your friend. It costs $800 and is currently on Kickstarter.

The Kickstarter page is filled with the same type of overpromising that we’ve seen with other (now very dead) social robots: Kiki is “conscious,” “understands your feelings,” and “loves you back.” Oof. That said, we’re happy to see more startups trying to succeed in this space, which is certainly one of the toughest in consumer electronics, and hopefully they’ve been learning from the recent string of failures. And we have to say Kiki is a cute robot. Its overall design, especially the body mechanics and expressive face, looks neat. And kudos to the team—the company was founded by two ex-Googlers, Mita Yun and Jitu Das—for including the “unedited prototype videos,” which help counterbalance the hype.

Another thing that Kiki has going for it is that everything runs on the robot itself. This simplifies privacy and means that the robot won’t partially die on you if the company behind it goes under, but also limits how clever the robot will be able to be. The Kickstarter campaign is already over a third funded, so… we’ll see.

[ Kickstarter ]

When your UAV isn’t enough UAV, so you put a UAV on your UAV.

[ CanberraUAV ]

ABB’s YuMi is testing ATMs because a human trying to do this task would go broke almost immediately.

[ ABB ]

DJI has a fancy new FPV system that features easy setup, digital HD streaming at up to 120 FPS, and <30ms latency.

If it looks expensive, that’s because it costs $930 with the remote included.

[ DJI ]

Honeybee Robotics has recently developed a regolith excavation and rock cleaning system for NASA JPL’s PUFFER rovers. This system, called POCCET (PUFFER-Oriented Compact Cleaning and Excavation Tool), uses compressed gas to perform all excavation and cleaning tasks. Weighing less than 300 grams with potential for further mass reduction, POCCET can be used not just on the Moon, but on other Solar System bodies such as asteroids, comets, and even Mars.

[ Honeybee Robotics ]

DJI’s 2019 RoboMaster tournament, which takes place this month in Shenzhen, looks like it’ll be fun to watch, with plenty of action and rules that are easy to understand.

[ RoboMaster ]

Robots and baked goods are an automatic Video Friday inclusion.

Wow I want a cupcake right now.

[ Soft Robotics ]

The ICRA 2019 Best Paper Award went to Michelle A. Lee at Stanford, for “Making Sense of Vision and Touch: Self-Supervised Learning of Multimodal Representations for Contact-Rich Tasks.”

The ICRA video is here, and you can find the paper at the link below.

[ Paper ] via [ RoboHub ]

Cobalt Robotics put out a bunch of marketing-y videos this week, but this one is reasonably interesting, even if you’re familiar with what they’re doing over there.

[ Cobalt Robotics ]

RightHand Robotics launched RightPick2 with a gala event which looked like fun as long as you were really, really into robots.

[ RightHand Robotics ]

Thanks Jeff!

This video presents a framework for whole-body control applied to the assistive robotic system EDAN. We show how the proposed method can be used for a task like opening, passing through, and closing a door. We also show the efficiency of the whole-body coordination when controlling the end-effector with respect to a fixed reference. Additionally, we show how easily the system can be manually manoeuvred by direct interaction with the end-effector, without the need for an extra input device.

[ DLR ]

You’ll probably need to turn on auto-translated subtitles for most of this, but it’s worth it for the adorable little single-seat robotic car designed to help people get around airports.

[ ZMP ]

In this week’s episode of Robots in Depth, Per speaks with Gonzalo Rey from Moog about their fancy 3D printed integrated hydraulic actuators.

Gonzalo talks about how Moog got started with hydraulic control, taking part in the space program and early robotics development. He shares how Moog’s technology is used in fly-by-wire systems in aircraft and in flow control in deep space probes. They have even reached Mars.

[ Robots in Depth ]
