Tag Archives: sensors

#437826 Video Friday: Skydio 2 Drone Is Back on ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Skydio, which makes what we’re pretty sure is the most intelligent consumer drone (or maybe just drone period) in existence, has been dealing with COVID-19 just like the rest of us. Even so, they’ve managed to push out a major software update, and pre-orders for the Skydio 2 are now open again.

If you think you might want one, read our review, after which you’ll be sure you want one.

[ Skydio ]

Worried about people with COVID entering your workplace? Misty II has your front desk covered, in a way that’s quite a bit friendlier than many other options.

Misty II provides a dynamic and interactive screening experience that delivers a joyful experience in an otherwise depressing moment while also providing state-of-the-art thermal scanning and health screening. We have already found that employees, customers, and visitors appreciate the novelty of interacting with a clever and personable robot. Misty II engages dynamically, both visually and verbally. Companies appreciate using a solution with a blackbody-referenced thermal camera that provides high accuracy and a short screening process for efficiency. Putting a robot to work in this role shifts not only how people look at the screening process but also how robots can take on useful assignments in businesses, schools, and homes.

[ Misty Robotics ]

Thanks Tim!

I’m definitely the one in the middle.

[ Agility Robotics ]

NASA’s Ingenuity helicopter is traveling to Mars attached to the belly of the Perseverance rover and must safely detach to begin the first attempt at powered flight on another planet. Tests done at NASA’s Jet Propulsion Laboratory and Lockheed Martin Space show the sequence of events that will bring the helicopter down to the Martian surface.

[ JPL ]

Here’s a sequence of videos of Cassie Blue making it (or mostly making it) up a 22-degree slope.

My mood these days is Cassie at 1:09.

[ University of Michigan ]

Thanks Jesse!

This is somewhere on the line between home automation and robotics, but it’s a cool idea: A baby crib that “uses computer vision and machine learning to recognize subtle changes” in an infant’s movement, and proactively bounces them to keep them sleeping peacefully.

It costs $1000, but how much value do you put on 24 months of your own sleep?

[ Cradlewise ]

Thanks Ben!

As captive marine mammal shows have fallen from favor, and the catching, transporting, and breeding of marine animals has become more restricted, running a marine park as a viable business has become more challenging – yet the audience appetite for this type of entertainment and education has remained constant.

Real-time Animatronics provide a way to reinvent the marine entertainment industry with a sustainable, safe, and profitable future. Show venues include aquariums, marine parks, theme parks, fountain shows, cruise lines, resort hotels, shopping malls, museums, and more.

[ EdgeFX ] via [ Gizmodo ]

Robotic cabling is surprisingly complex and kinda cool to watch.

The video shows the sophisticated robot application “Automatic control cabinet cabling,” which Fraunhofer IPA implemented together with the company Rittal. The software pitasc, developed at Fraunhofer IPA, is used for force-controlled assembly processes. Two UR robot arms carry out the task together. The modular pitasc system enables the robot arms to move and rotate in parallel. They work hand in hand: one robot holds the cable while the second brings it to the starting position for cabling. The robots can find, tighten, hold ready, lay, plug in, fix, move freely, or immerse cables. They can also perform push-ins and pull tests.

[ Fraunhofer ]
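Force-controlled assembly like this usually boils down to some flavor of admittance control: nudge the tool in proportion to the error between the measured and desired contact force, so the robot "gives way" until contact feels right. Here's a deliberately minimal 1-D sketch of that idea; pitasc itself is far more sophisticated, and every number below is invented for illustration.

```python
# Minimal admittance-control sketch (generic, not the actual pitasc API):
# the tool position is adjusted in proportion to the force error until the
# measured contact force matches the target insertion force.

def admittance_step(position, measured_force, target_force, compliance=0.0002):
    """Return the new 1-D tool position (m) after one control cycle."""
    force_error = target_force - measured_force
    return position + compliance * force_error

# Simulate pressing a connector into a socket modeled as a linear spring.
stiffness = 2000.0       # N/m, hypothetical environment stiffness
contact_start = 0.10     # m, position where contact begins
position = 0.10
target_force = 5.0       # N, desired insertion force

for _ in range(200):
    measured = max(0.0, stiffness * (position - contact_start))
    position = admittance_step(position, measured, target_force)

# The loop settles where the spring force equals the 5 N target.
final_force = stiffness * (position - contact_start)
```

With the compliance gain chosen so that `1 - compliance * stiffness` is between 0 and 1, the force error shrinks geometrically each cycle instead of oscillating.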

This is from 2018, but the concept is still pretty neat.

We propose to perform a novel investigation into the ability of a propulsively hopping robot to reach targets of high science value on the icy, rugged terrains of Ocean Worlds. The employment of a multi-hop architecture allows for the rapid traverse of great distances, enabling a single mission to reach multiple geologic units within a timespan conducive to system survival in a harsh radiation environment. We further propose that the use of a propulsive hopping technique obviates the need for terrain topographic and strength assumptions and allows for complete terrain agnosticism – a key strength of this concept.

[ NASA ]

Aerial-aquatic robots possess the unique ability to operate in both air and water. However, this capability comes with tremendous challenges, such as communication incompatibility, increased airborne mass, potentially inefficient operation in each environment, and manufacturing difficulties. Such robots therefore typically have small payloads and a limited operational envelope, often making their field usage impractical. We propose a novel robotic water-sampling approach that combines the robust technologies of multirotors and underwater micro-vehicles into a single integrated tool usable for field operations.

[ Imperial ]

Event cameras are bio-inspired vision sensors with microsecond latency, a much larger dynamic range, and roughly a hundred times lower power consumption than standard cameras. This 20-minute talk gives a short tutorial on event cameras and shows their applications in computer vision, drones, and cars.

[ UZH ]
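If you're curious what event-camera output actually looks like: instead of frames, you get an asynchronous stream of (x, y, timestamp, polarity) events, one per pixel brightness change, which is exactly what gives these sensors their latency and power advantages. A toy accumulator (with made-up events, not a real camera driver) looks like this:

```python
import numpy as np

# Each event is (x, y, timestamp_s, polarity): a pixel fires only when its
# log brightness changes. Summing signed polarities per pixel gives a crude
# "event frame" you can feed to ordinary vision code.

def accumulate_events(events, width, height):
    """Sum signed polarities per pixel to build a simple event frame."""
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, polarity in events:
        frame[y, x] += 1 if polarity else -1
    return frame

events = [
    (2, 1, 0.000001, True),   # brightness increase at pixel (2, 1)
    (2, 1, 0.000005, True),   # same pixel fires again 4 microseconds later
    (0, 0, 0.000007, False),  # brightness decrease at pixel (0, 0)
]
frame = accumulate_events(events, width=4, height=3)
# frame[1, 2] == 2, frame[0, 0] == -1, every other pixel == 0
```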

We interviewed Paul Newman, Perla Maiolino and Lars Kunze, ORI academics, to hear what gets them excited about robots in the future and any advice they have for those interested in the field.

[ Oxford Robotics Institute ]

Two projects from the Rehabilitation Engineering Lab at ETH Zurich, including a self-stabilizing wheelchair and a soft exoskeleton for grasping assistance.

[ ETH Zurich ]

Silicon Valley Robotics hosted an online conversation about robotics and racism. Moderated by Andra Keay, the panel featured Maynard Holliday, Tom Williams, Monroe Kennedy III, Jasmine Lawrence, Chad Jenkins, and Ken Goldberg.

[ SVR ]

The ICRA Legged Locomotion workshop has been taking place online, and while we’re not getting a robot mosh pit, there are still some great talks. We’ll post two here, but for more, follow the legged robots YouTube channel at the link below.

[ YouTube ]

Posted in Human Robots

#437820 In-Shoe Sensors and Mobile Robots Keep ...

In-shoe sensor

Researchers at Stevens Institute of Technology are leveraging some of the newest mechanical and robotic technologies to help some of our oldest populations stay healthy, active, and independent.

Yi Guo, professor of electrical and computer engineering and director of the Robotics and Automation Laboratory, and Damiano Zanotto, assistant professor of mechanical engineering, and director of the Wearable Robotic Systems Laboratory, are collaborating with Ashley Lytle, assistant professor in Stevens’ College of Arts and Letters, and Ashwini K. Rao of Columbia University Medical Center, to combine an assistive mobile robot companion with wearable in-shoe sensors in a system designed to help elderly individuals maintain the balance and motion they need to thrive.

“Balance and motion can be significant issues for this population, and if elderly people fall and experience an injury, they are less likely to stay fit and exercise,” Guo said. “As a consequence, their level of fitness and performance decreases. Our mobile robot companion can help decrease the chances of falling and contribute to a healthy lifestyle by keeping their walking function at a good level.”

The mobile robots are designed to lead walking sessions and, using the in-shoe sensors, to monitor the user’s gait, flag issues, and adjust the exercise speed and pace. The initiative is part of a four-year National Science Foundation research project.

“For the first time, we’re integrating our wearable sensing technology with an autonomous mobile robot,” said Zanotto, who worked with elderly people at Columbia University Medical Center for three years before coming to Stevens in 2016. “It’s exciting to be combining these different areas of expertise to leverage the strong points of wearable sensing technology, such as accurately capturing human movement, with the advantages of mobile robotics, such as much larger computational powers.”

The team is developing algorithms that fuse real-time data from smart, unobtrusive, in-shoe sensors and advanced on-board sensors to inform the robot’s navigation protocols and control the way the robot interacts with elderly individuals. It’s a promising way to assist seniors in safely doing walking exercises and maintaining their quality of life.
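The team's actual fusion algorithms aren't public, but the general idea of combining two noisy estimates according to relative trust can be sketched with a toy weighted filter; the sensor sources and weights below are hypothetical.

```python
# Toy fusion of two noisy walking-speed estimates: one from hypothetical
# in-shoe sensors, one from the robot's onboard sensors. The weight encodes
# how much more we trust the wearable's motion capture than the robot's
# exteroceptive estimate; real fusion would adapt these weights online.

def fuse(shoe_speed, robot_speed, shoe_weight=0.7):
    """Weighted average of two speed estimates (m/s)."""
    return shoe_weight * shoe_speed + (1.0 - shoe_weight) * robot_speed

fused = fuse(shoe_speed=1.10, robot_speed=1.30)
# fused == 0.7 * 1.10 + 0.3 * 1.30 == 1.16 m/s
```

The robot would then use the fused estimate to pace the walking session, slowing down when the wearable side reports the user is lagging.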

Bringing the benefits of the lab to life

Guo and Zanotto are working with Lytle, an expert in social and health psychology, to implement a social connectivity capability and make the bi-directional interaction between human and robot even more intuitive, engaging, and meaningful for seniors.

“Especially during COVID, it’s important for elderly people living on their own to connect socially with family and friends,” Zanotto said, “and the robot companion will also offer teleconferencing tools to provide that interaction in an intuitive and transparent way.”

“We want to use the robot for social connectedness, perhaps integrating it with a conversation agent such as Alexa,” Guo added. “The goal is to make it a companion robot that can sense, for example, that you are cooking, or you’re in the living room, and help with things you would do there.”

It’s a powerful example of how abstract concepts can have meaningful real-life benefits.

“As engineers, we tend to work in the lab, trying to optimize our algorithms and devices and technologies,” Zanotto noted, “but at the end of the day, what we do has limited value unless it has impact on real life. It’s fascinating to see how the devices and technologies we’re developing in the lab can be applied to make a difference for real people.”

Maintaining balance in a global pandemic

Although COVID-19 has delayed the planned testing at a senior center in New York City, it has not stopped the team’s progress.

“Although we can’t test on elderly populations yet, our students are still testing in the lab,” Guo said. “This summer and fall, for the first time, the students validated the system’s real-time ability to monitor and assess the dynamic margin of stability during walking—in other words, to evaluate whether the person following the robot is walking normally or has a risk of falling. They’re also designing parameters for the robot to give early warnings and feedback that help the human subjects correct posture and gait issues while walking.”
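One standard way to quantify a dynamic margin of stability (whether this is the team's exact method is an assumption on our part) is Hof's "extrapolated center of mass": project the center of mass forward by its velocity scaled by the inverted-pendulum frequency, then measure how far that point stays inside the base of support. All numbers below are illustrative.

```python
import math

# Hof's extrapolated-center-of-mass margin of stability. A positive margin
# means the extrapolated CoM is still inside the base of support; a
# shrinking or negative margin indicates elevated fall risk.

def margin_of_stability(com_pos, com_vel, bos_edge, leg_length, g=9.81):
    """Return the margin (m) between the base-of-support edge and the XcoM."""
    omega0 = math.sqrt(g / leg_length)     # inverted-pendulum eigenfrequency
    xcom = com_pos + com_vel / omega0      # extrapolated CoM position
    return bos_edge - xcom

mos = margin_of_stability(com_pos=0.05, com_vel=0.20,
                          bos_edge=0.15, leg_length=0.9)
# roughly 0.039 m of margin for this (invented) walking snapshot
```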

Those warnings would be literally underfoot, as the in-shoe sensors would pulse like a vibrating cell phone to deliver immediate directional information to the subject.

“We’re not the first to use this vibrotactile stimuli technology, but this application is new,” Zanotto said.

So far, the team has published papers in top robotics publication venues including IEEE Transactions on Neural Systems and Rehabilitation Engineering and the 2020 IEEE International Conference on Robotics and Automation (ICRA). It’s a big step toward realizing the synergies of bringing the technical expertise of engineers to bear on the clinical focus on biometrics—and the real lives of seniors everywhere.


#437807 Why We Need Robot Sloths

An inherent characteristic of a robot (I would argue) is embodied motion. We tend to focus on motion rather a lot with robots, and the most dynamic robots get the most attention. This isn’t to say that highly dynamic robots don’t deserve our attention, but there are other robotic philosophies that, while perhaps less visually exciting, are equally valuable under the right circumstances. Magnus Egerstedt, a robotics professor at Georgia Tech, was inspired by some sloths he met in Costa Rica to explore the idea of “slowness as a design paradigm” through an arboreal robot called SlothBot.

Since the robot moves so slowly, why use a robot at all? It may be very energy-efficient, but it’s definitely not more energy efficient than a static sensing system that’s just bolted to a tree or whatever. The robot moves, of course, but it’s also going to be much more expensive (and likely much less reliable) than a handful of static sensors that could cover a similar area. The problem with static sensors, though, is that they’re constrained by power availability, and in environments like under a dense tree canopy, you’re not going to be able to augment their lifetime with solar panels. If your goal is a long-duration study of a small area (over weeks or months or more), SlothBot is uniquely useful in this context because it can crawl out from beneath a tree to find some sun to recharge itself, sunbathe for a while, and then crawl right back again to resume collecting data.

SlothBot is such an interesting concept that we had to check in with Egerstedt with a few more questions.

IEEE Spectrum: Tell us what you find so amazing about sloths!

Magnus Egerstedt: Apart from being kind of cute, the amazing thing about sloths is that they have carved out a successful ecological niche for themselves where being slow is not only acceptable but actually beneficial. Despite their pretty extreme low-energy lifestyle, they exhibit a number of interesting and sometimes outright strange behaviors. And, behaviors having to do with territoriality, foraging, or mating look rather different when you are that slow.

Are you leveraging the slothiness of the design for this robot somehow?

Sadly, the sloth design serves no technical purpose. But we are also viewing the SlothBot as an outreach platform to get kids excited about robotics and/or conservation biology. And having the robot look like a sloth certainly cannot hurt.

“Slowness is ideal for use cases that require a long-term, persistent presence in an environment, like for monitoring tasks. I can imagine slow robots being out on farm fields for entire growing cycles, or suspended on the ocean floor keeping track of pollutants or temperature variations.”
—Magnus Egerstedt, Georgia Tech

Can you talk more about slowness as a design paradigm?

The SlothBot is part of a broader design philosophy that I have started calling “Robot Ecology.” In ecology, the connections between individuals and their environments/habitats play a central role. And the same should hold true in robotics. The robot design must be understood in the environmental context in which it is to be deployed. And, if your task is to be present in a slowly varying environment over a long time scale, being slow seems like the right way to go. Slowness is ideal for use cases that require a long-term, persistent presence in an environment, like for monitoring tasks, where the environment itself is slowly varying. I can imagine slow robots being out on farm fields for entire growing cycles, or suspended on the ocean floor keeping track of pollutants or temperature variations.

How do sloths inspire SlothBot’s functionality?

Its motions are governed by what we call survival constraints. These constraints ensure that the SlothBot is always able to get to a sunny spot to recharge. The actual performance objective that we have given to the robot is to minimize energy consumption, i.e., to simply do nothing subject to the survival constraints. The majority of the time, the robot simply sits there under the trees, measuring various things, seemingly doing absolutely nothing and being rather sloth-like. Whenever the SlothBot does move, it does not move according to some fixed schedule. Instead, it moves because it has to in order to “survive.”
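That "do nothing subject to survival constraints" policy can be sketched in a few lines; the thresholds, energy costs, and action names here are all invented for illustration.

```python
# Toy survival-constrained policy: the robot stays put and senses unless
# its remaining battery would soon be insufficient to reach a charging
# spot (plus a safety reserve), at which point it must move to "survive".

def next_action(battery, distance_to_sun, move_cost_per_m=0.5, reserve=5.0):
    """Return 'sense' (do nothing) or 'seek_sun' (recharge), in energy units."""
    energy_needed = distance_to_sun * move_cost_per_m + reserve
    return "seek_sun" if battery <= energy_needed else "sense"

action_when_full = next_action(battery=80.0, distance_to_sun=10.0)  # "sense"
action_when_low = next_action(battery=9.0, distance_to_sun=10.0)    # "seek_sun"
```

Everything above the constraint boundary is "free" time for data collection, which is why the robot appears to do nothing most of the day.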

How would you like to improve SlothBot?

I have a few directions I would like to take the SlothBot. One is to make the sensor suites richer to make sure that it can become a versatile and useful science instrument. Another direction involves miniaturization – I would love to see a bunch of small SlothBots “living” among the trees somewhere in a rainforest for years, providing real-time data as to what is happening to the ecosystem.


#437789 Video Friday: Robotic Glove Features ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Evidently, the folks at Unitree were paying attention to last week’s Video Friday.

[ Unitree ]

RoboSoft 2020 was a virtual conference this year (along with everything else), but they still held a soft robots contest, and here are four short vids—you can watch the rest of them here.

[ RoboSoft 2020 ]

If you were wondering why SoftBank bought Aldebaran Robotics and Boston Dynamics, here’s the answer.

I am now a Hawks fan. GO HAWKS!

[ Softbank Hawks ] via [ RobotStart ]

Scientists at the University of Liverpool have developed a fully autonomous mobile robot to assist them in their research. Using a type of AI, the robot has been designed to work uninterrupted for weeks at a time, allowing it to analyse data and make decisions on what to do next. Using a flexible arm with a customised gripper, it can be calibrated to interact with most standard lab equipment and machinery, as well as navigate safely around human co-workers and obstacles.

[ Nature ]

Oregon State’s Cassie has been on break for a couple of months, but it’s back in the lab and moving alarmingly quickly.

[ DRL ]

The current situation linked to COVID-19 sadly led to the postponement of this year’s RoboCup 2020 in Bordeaux. As an official sponsor of The RoboCup, SoftBank Robotics wanted to take this opportunity to thank all RoboCupers and The RoboCup Federation for their support these past 13 years. We invite you to take a look at NAO’s adventures at The RoboCup as the official robot of the Standard Platform League. See you in Bordeaux in 2021!

[ RoboCup 2021 ]

Miniature SAW robot crawling inside the intestines of a pig. You’re welcome.

[ Zarrouk Lab ]

The video demonstrates fast autonomous flight experiments in cluttered unknown environments, with the support of a robust and perception-aware replanning framework called RAPTOR. The associated paper has been submitted to TRO.

[ HKUST ]

Since we haven’t gotten autonomy quite right yet, there’s a lot of telepresence going on for robots that operate in public spaces. Usually, you’ve got one remote human managing multiple robots, so it would be nice to make that interface a little more friendly, right?

[ HCI Lab ]

Arguable whether or not this is a robot, but it’s cool enough to spend a minute watching.

[ Ishikawa Lab ]

Communication is critical to collaboration; however, too much of it can degrade performance. Motivated by the need for effective use of a robot’s communication modalities, in this work, we present a computational framework that decides if, when, and what to communicate during human-robot collaboration.

[ Interactive Robotics ]

Robotiq has released the next generation of its grippers for collaborative robots: the 2F-85 and 2F-140. Both models gain greater robustness, safety, and customizability while retaining the key benefits that have led thousands of manufacturers to choose them since their launch six years ago.

[ Robotiq ]

ANYmal C, the autonomous legged robot designed for challenging industrial environments, provides the mobility, autonomy, and inspection intelligence to enable safe and efficient inspection operations. In this virtual showcase, discover how ANYmal climbs stairs, recovers from a fall, performs an autonomous mission, avoids obstacles, docks to charge by itself, digitizes analog sensors, and monitors the environment.

[ ANYbotics ]

At Waymo, we are committed to addressing inequality, and we believe listening is a critical first step toward driving positive change. Earlier this year, five Waymonauts sat down to share their thoughts on equity at work, challenging the status quo, and more. This is what they had to say.

[ Waymo ]

Nice of ABB to take in old robots and upgrade them to turn them into new robots again. Robots forever!

[ ABB ]

It’s nice to see the progress being made by GITAI, one of the teams competing in the ANA Avatar XPRIZE Challenge, and to meet the humans behind the robots.

[ GITAI ] via [ XPRIZE ]

One more talk from the ICRA Legged Robotics Workshop: Jingyu Liu from DeepRobotics and Qiuguo Zhu from Zhejiang University.

[ Deep Robotics ]


#437776 Video Friday: This Terrifying Robot Will ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.

The Aigency, which created the FitBot launch video below, is “the world’s first talent management resource for robotic personalities.”

Robots will be playing a bigger role in our lives in the future. By learning to speak their language and work with them now, we can make this future better for everybody. If you’re a creator that’s producing content to entertain and educate people, robots can be a part of that. And we can help you. Robotic actors can show up alongside the rest of your actors.

The folks at Aigency have put together a compilation reel of clips they’ve put on TikTok, which is nice of them, because some of us don’t know how to TikTok because we’re old and boring.

Do googly eyes violate the terms and conditions?

[ Aigency ]

Shane Wighton of the “Stuff Made Here” YouTube channel, who you might remember from that robotic basketball hoop, has a new invention: a haircut robot. This is not the first barber bot, but previous designs typically used hair clippers; Shane wanted his robot to use scissors. Hilarious and terrifying at once.

[ Stuff Made Here ]

Starting in October of 2016, Prof. Charlie Kemp and Henry M. Clever invented a new kind of robot. They named the prototype NewRo. In March of 2017, Prof. Kemp filmed this video of Henry operating NewRo to perform a number of assistive tasks. While visiting the Bay Area for an AAAI Symposium workshop at Stanford, Prof. Kemp showed this video to a select group of people to get advice, including Dr. Aaron Edsinger. In August of 2017, Dr. Edsinger and Dr. Kemp founded Hello Robot Inc. to commercialize this patent-pending assistive technology. Hello Robot Inc. licensed the intellectual property (IP) from Georgia Tech. After three years of stealthy effort, Hello Robot Inc. revealed Stretch, a new kind of robot!

[ Georgia Tech ]

NASA’s Ingenuity Mars Helicopter will make history's first attempt at powered flight on another planet next spring. It is riding with the agency's next mission to Mars (the Mars 2020 Perseverance rover) as it launches from Cape Canaveral Air Force Station later this summer. Perseverance, with Ingenuity attached to its belly, will land on Mars February 18, 2021.

[ JPL ]

For humans, it can be challenging to manipulate thin flexible objects like ropes, wires, or cables. And if these problems are hard for humans, they are nearly impossible for robots. As a cable slides between the fingers, its shape is constantly changing, and the robot’s fingers must be constantly sensing and adjusting the cable’s position and motion. A group of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the MIT Department of Mechanical Engineering pursued the task from a different angle, in a manner that more closely mimics humans. The team’s new system uses a pair of soft robotic grippers with high-resolution tactile sensors (and no added mechanical constraints) to successfully manipulate freely moving cables.

The team observed that it was difficult to pull the cable back when it reached the edge of the finger, because of the convex surface of the GelSight sensor. Therefore, they hope to improve the finger-sensor shape to enhance the overall performance. In the future, they plan to study more complex cable manipulation tasks such as cable routing and cable inserting through obstacles, and they want to eventually explore autonomous cable manipulation tasks in the auto industry.

[ MIT ]
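The MIT system uses learned controllers on high-resolution GelSight images, but the underlying intuition (servo the gripper to re-center the cable based on the sensed offset) can be sketched as a simple proportional controller with made-up numbers:

```python
# Toy tactile servo for cable following: each control cycle, the gripper
# shifts by a fraction of the cable's sensed off-center distance. This is
# only the proportional-control intuition, not the actual MIT controller.

def recenter(gripper_pos, cable_offset, gain=0.5):
    """Move the gripper (mm) a fraction of the sensed offset each cycle."""
    return gripper_pos + gain * cable_offset

pos, target = 0.0, 8.0            # mm; cable starts 8 mm off-center
for _ in range(10):
    pos = recenter(pos, target - pos)   # sensed offset shrinks as it tracks
# after ten cycles the gripper has closed nearly all of the 8 mm error
```

With gain 0.5 the tracking error halves every cycle, so ten cycles leave less than a hundredth of a millimeter of error in this idealized setting.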

Gripping robots typically have trouble grasping transparent or shiny objects. A new technique from Carnegie Mellon University relies on a color camera system and machine learning to recognize shapes based on color.

[ CMU ]

A new robotic prosthetic leg prototype offers a more natural, comfortable gait while also being quieter and more energy efficient than other designs. The key is the use of new small and powerful motors with fewer gears, borrowed from the space industry. This streamlined technology enables a free-swinging knee and regenerative braking, which charges the battery during use with energy that would typically be dissipated when the foot hits the ground. This feature enables the leg to cover more than double a typical prosthetic user’s daily walking needs on a single charge.

[ University of Michigan ]

Thanks Kate!

This year’s Wonder League teams have been put to the test not only by the challenges set forth by Wonder Workshop and Cartoon Network – helping the creek kids from Craig of the Creek solve the greatest mystery of all, the quest for the Lost Realm – but also by forces outside their control. With a global pandemic separating many teams through lockdowns and quarantines, these teams continued to find new ways to work together, solve problems, communicate more effectively, and complete a journey that they started and refused to give up on. We at Wonder Workshop are humbled and in awe of all these teams have accomplished.

[ Wonder Workshop ]

Thanks Nicole!

Meet Colin Creager, a mechanical engineer at NASA's Glenn Research Center. Colin is focusing on developing tires that can be used on other worlds. These tires use coil springs made of a special shape memory alloy that will let rovers move across sharp jagged rocks or through soft sand on the Moon or Mars.

[ NASA ]

To be presented at IROS this year: “the first on-robot collision detection system using low-cost microphones.”

[ Rutgers ]
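A real acoustic collision detector has to reject motor and ambient noise, but the core idea of flagging a sudden spike in short-window audio energy can be sketched like this (all samples and thresholds are invented):

```python
# Toy amplitude-threshold collision detector: flag the first short window
# of microphone samples whose mean absolute amplitude exceeds a threshold.
# A deployed system would filter out motor noise and learn the threshold.

def detect_collision(samples, window=4, threshold=0.5):
    """Return the start index of the first window whose energy spikes."""
    for i in range(len(samples) - window + 1):
        energy = sum(abs(s) for s in samples[i:i + window]) / window
        if energy > threshold:
            return i
    return None

quiet = [0.01, -0.02, 0.015, -0.01]    # hypothetical idle-motion samples
impact = [0.9, -0.8, 0.7, -0.6]        # hypothetical contact transient
idx = detect_collision(quiet + impact)
# the detector fires on the first window overlapping the impact onset
```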

Robot and mechanism designs inspired by the art of Origami have the potential to generate compact, deployable, lightweight morphing structures, as seen in nature, for potential applications in search-and-rescue, aerospace systems, and medical devices. However, it is challenging to obtain actuation that is easily patternable, reversible, and made with a scalable manufacturing process for origami-inspired self-folding machines. In this work, we describe an approach to design reversible self-folding machines using liquid crystal elastomer (LCE), that contracts when heated, as an artificial muscle.

[ UCSD ]

Just in case you need some extra home entertainment, and you’d like cleaner floors at the same time.

[ iRobot ]

Sure, toss it from a drone. Or from orbit. Whatever, it’s squishy!

[ Squishy Robotics ]

The [virtual] RSS conference this week featured an excellent lineup of speakers and panels, and the best part about it being virtual is that you can watch them all at your leisure! Here’s what’s been posted so far:

[ RSS 2020 ]

Lockheed Martin Robotics Seminar: Toward autonomous flying insect-sized robots: recent results in fabrication, design, power systems, control, and sensing with Sawyer Fuller.

[ UMD ]

In this episode of the AI Podcast, Lex interviews Sergey Levine.

[ AI Podcast ]
