#439198 Video Friday: A Robot to Brush Your Hair
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
ICRA 2021 – May 30 – June 5, 2021 – [Online Event]
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27 – October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.
With rapidly growing demands on health care systems, nurses typically spend 18 to 40 percent of their time performing direct patient care tasks, oftentimes for many patients and with little time to spare. Personal care robots that brush your hair could provide substantial help and relief.
While the hardware set-up looks futuristic and shiny, the underlying model of the hair fibers is what makes it tick. CSAIL postdoc Josie Hughes and her team’s approach examined entangled soft fiber bundles as sets of entwined double helices – think classic DNA strands. This level of granularity provided key insights into mathematical models and control systems for manipulating bundles of soft fibers, with a wide range of applications in the textile industry, animal care, and other fibrous systems.
[ MIT CSAIL ]
Sometimes CIA needs to get creative when collecting intelligence. Charlie, for instance, is a robotic catfish that collects water samples. While never used operationally, the unmanned underwater vehicle (UUV) fish was created to study aquatic robot technology.
[ CIA ]
It's really just a giant drone, even if it happens to be powered by explosions.
[ SpaceX ]
Somatic's robot will clean your bathrooms for 40 hours a week and will cost you just $1,000 a month. It looks like it works quite well, as long as your bathrooms are the normal level of gross as opposed to, you know, super gross.
[ Somatic ]
NASA’s Ingenuity Mars Helicopter successfully completed a fourth, more challenging flight on the Red Planet on April 30, 2021. Flight Test No. 4 aimed for a longer flight time, longer distance, and more image capturing to begin to demonstrate its ability to serve as a scout on Mars. Ingenuity climbed to an altitude of 16 feet (5 meters) before flying south and back for an 872-foot (266-meter) round trip. In total, Ingenuity was in the air for 117 seconds, another set of records for the helicopter.
[ Ingenuity ]
The Perseverance rover is all new and shiny, but let's not forget about Curiosity, still hard at work over in Gale crater.
NASA’s Curiosity Mars rover took this 360-degree panorama while atop “Mont Mercou,” a rock formation that offered a view into Gale Crater below. The panorama is stitched together from 132 individual images taken on April 15, 2021, the 3,090th Martian day, or sol, of the mission. The panorama has been white-balanced so that the colors of the rock materials resemble how they would appear under daytime lighting conditions on Earth. Images of the sky and rover hardware were not included in this terrain mosaic.
[ MSL ]
Happy Star Wars Day from Quanser!
[ Quanser ]
Thanks Arman!
Lingkang Zhang's 12 DOF Raspberry Pi-powered quadruped robot, Yuki Mini, is complete!
Adorable, right? It runs ROS and the hardware is open source as well.
[ Yuki Mini ]
Thanks Lingkang!
Honda and AutoX have been operating a fully autonomous, no safety driver taxi service in China for a couple of months now.
If you thought SF was hard, well, I feel like this is even harder.
[ AutoX ]
This is the kind of drone delivery that I can get behind.
[ WeRobotics ]
The Horizon 2020 EU-funded PRO-ACT project aims to develop and demonstrate cooperation and manipulation capabilities among three robots for assembling an in-situ resource utilisation (ISRU) plant. PRO-ACT will show how robot working agents, or RWAs, can work together collaboratively to achieve a common goal.
[ Pro-Act ]
Thanks Fan!
This brief quadruped simulation video, from Jerry Pratt at IHMC, dates back to 2003 (!).
[ IHMC ]
Extend Robotics' vision is to extend human capability beyond physical presence. We build affordable robotic arms capable of remote operation from anywhere in the world, using cloud-based teleoperation software.
[ Extend Robotics ]
Meet Maria Vittoria Minniti, robotics engineer and PhD student at NCCR Digital Fabrication and ETH Zurich. Maria Vittoria makes it possible for simple robots to do complicated things.
[ NCCR Women ]
Thanks Fan!
iCub has been around for 10 years now, and it's almost like it hasn't gotten any taller! This IFRR Robotics Global Colloquium celebrates the past decade of iCub.
[ iCub ]
This CMU RI Seminar is by Cynthia Sung from UPenn, on Dynamical Robots via Origami-Inspired Design.
Origami-inspired engineering produces structures with high strength-to-weight ratios and simultaneously lower manufacturing complexity. This reliable, customizable, cheap fabrication and component assembly technology is ideal for robotics applications in remote, rapid deployment scenarios that require platforms to be quickly produced, reconfigured, and deployed. Unfortunately, most examples of folded robots are appropriate only for small-scale, low-load applications. In this talk, I will discuss efforts in my group to expand origami-inspired engineering to robots with the ability to withstand and exert large loads and to execute dynamic behaviors.
[ CMU RI ]
How can feminist methodologies and approaches be applied and be transformative when developing AI and ADM systems? How can AI innovation and social systems innovation be catalyzed concomitantly to create a positive movement for social change larger than the sum of the data science or social science parts? How can we produce actionable research that will lead to the profound changes needed—from scratch—in the processes to produce AI? In this seminar, 2020 CCSRE Race and Technology Practitioner Fellow Renata Avila discusses ideas and experiences from different disciplines that could help draft a blueprint for a better modeled digital future.
[ CMU RI ]
#438286 Humanoids that’ll blow your mind!
Here, the PRO Robots Channel highlights five of the most advanced humanoid robots.
#439110 Robotic Exoskeletons Could One Day Walk ...
Engineers, using artificial intelligence and wearable cameras, now aim to help robotic exoskeletons walk by themselves.
Increasingly, researchers around the world are developing lower-body exoskeletons to help people walk. These are essentially walking robots users can strap to their legs to help them move.
One problem with such exoskeletons: They often depend on manual controls to switch from one mode of locomotion to another, such as from sitting to standing, or standing to walking, or walking on the ground to walking up or down stairs. Relying on joysticks or smartphone apps every time you want to change how you move can prove awkward and mentally taxing, says Brokoslaw Laschowski, a robotics researcher at the University of Waterloo in Canada.
Scientists are working on automated ways to help exoskeletons recognize when to switch locomotion modes — for instance, using sensors attached to legs that can detect bioelectric signals sent from your brain to your muscles telling them to move. However, this approach comes with a number of challenges, such as how skin conductivity can change as a person’s skin gets sweatier or dries off.
Now several research groups are experimenting with a new approach: fitting exoskeleton users with wearable cameras to provide the machines with vision data that will let them operate autonomously. Artificial intelligence (AI) software can analyze this data to recognize stairs, doors, and other features of the surrounding environment and calculate how best to respond.
Laschowski leads the ExoNet project, the first open-source database of high-resolution wearable camera images of human locomotion scenarios. It holds more than 5.6 million images of indoor and outdoor real-world walking environments. The team used this data to train deep-learning algorithms; their convolutional neural networks can already automatically recognize different walking environments with 73 percent accuracy “despite the large variance in different surfaces and objects sensed by the wearable camera,” Laschowski notes.
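To make the accuracy figure concrete, here is a minimal sketch of how top-1 accuracy over a labeled stream of camera frames is computed. The class names and helper functions are hypothetical illustrations, not ExoNet's actual code or class taxonomy:

```python
# Hypothetical walking-environment classes; ExoNet's real taxonomy differs.
CLASSES = ["level_ground", "stairs_up", "stairs_down", "door"]

def top1(logits):
    """Index of the highest-scoring class in a logit vector."""
    return max(range(len(logits)), key=logits.__getitem__)

def accuracy(logit_stream, true_labels):
    """Fraction of frames whose top-1 prediction matches the label."""
    correct = sum(top1(l) == y for l, y in zip(logit_stream, true_labels))
    return correct / len(true_labels)

# Example: three frames, two classified correctly.
logits = [[2.0, 0.1, 0.0, 0.3],   # predicted level_ground
          [0.2, 1.5, 0.1, 0.0],   # predicted stairs_up
          [0.0, 0.1, 0.2, 0.9]]   # predicted door
labels = [0, 1, 2]                # last frame is actually stairs_down
print(accuracy(logits, labels))   # 2/3
```

In the real system, the logits would come from a convolutional neural network evaluated on each wearable-camera frame rather than the hard-coded values above.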
According to Laschowski, a potential limitation of their work is their reliance on conventional 2-D images, whereas depth cameras could also capture potentially useful distance data. He and his collaborators ultimately chose not to rely on depth cameras for a number of reasons, including the fact that the accuracy of depth measurements typically degrades in outdoor lighting and with increasing distance, he says.
In similar work, researchers in North Carolina had volunteers with cameras either mounted on their eyeglasses or strapped onto their knees walk through a variety of indoor and outdoor settings to capture the kind of image data exoskeletons might use to see the world around them. The aim? “To automate motion,” says Edgar Lobaton, an electrical engineering researcher at North Carolina State University. He says they are focusing on how AI software might reduce uncertainty due to factors such as motion blur or overexposed images “to ensure safe operation. We want to ensure that we can really rely on the vision and AI portion before integrating it into the hardware.”
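One generic way to flag unreliable frames — a sketch of the general idea, not Lobaton's actual method — is to threshold the entropy of the classifier's softmax distribution: blurred or overexposed frames tend to produce flatter, higher-entropy predictions, which the controller can then discard:

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    """Shannon entropy in nats; higher means less certain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def frame_is_reliable(logits, max_entropy=0.5):
    """Accept a frame only if the prediction is sufficiently peaked."""
    return entropy(softmax(logits)) <= max_entropy

print(frame_is_reliable([8.0, 0.1, 0.0, 0.2]))  # confident -> True
print(frame_is_reliable([1.0, 0.9, 1.1, 1.0]))  # near-uniform -> False
```

The `max_entropy` threshold here is an arbitrary illustrative value; a deployed system would tune it against the safety requirements of the hardware.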
In the future, Laschowski and his colleagues will focus on improving the accuracy of their environmental analysis software while keeping computational and memory requirements low, which is important for onboard, real-time operation on robotic exoskeletons. Lobaton and his team also seek to account for uncertainty introduced into their visual systems by movements.
Ultimately, the ExoNet researchers want to explore how AI software can transmit commands to exoskeletons so they can perform tasks such as climbing stairs or avoiding obstacles based on a system’s analysis of a user's current movements and the upcoming terrain. With autonomous cars as inspiration, they are seeking to develop autonomous exoskeletons that can handle the walking task without human input, Laschowski says.
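A bare-bones sketch of such a control layer might map the vision system's terrain class and the current locomotion mode to the next mode, with the user always able to override. The mode names, transition table, and override hook below are all hypothetical, not the ExoNet design:

```python
# Hypothetical transition table: (current mode, detected terrain) -> next mode.
TRANSITIONS = {
    ("standing", "level_ground"): "walking",
    ("walking", "stairs_up"): "stair_ascent",
    ("walking", "stairs_down"): "stair_descent",
    ("stair_ascent", "level_ground"): "walking",
    ("stair_descent", "level_ground"): "walking",
}

def next_mode(current, terrain, user_override=None):
    """Pick the next locomotion mode; a user override always wins."""
    if user_override is not None:
        return user_override
    # Stay in the current mode when no transition is defined.
    return TRANSITIONS.get((current, terrain), current)

print(next_mode("walking", "stairs_up"))    # stair_ascent
print(next_mode("walking", "level_ground")) # walking (no change)
print(next_mode("walking", "stairs_up", user_override="standing"))  # standing
```

The unconditional override branch reflects the safety requirement Laschowski describes: the classifier only proposes transitions, and the user retains final authority.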
However, Laschowski adds, “User safety is of the utmost importance, especially considering that we're working with individuals with mobility impairments,” resulting perhaps from advanced age or physical disabilities.
“The exoskeleton user will always have the ability to override the system should the classification algorithm or controller make a wrong decision.”