Tag Archives: dynamic

#437864 Video Friday: Jet-Powered Flying ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRA 2020 – June 1-15, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

ICRA 2020, the world’s best, biggest, longest virtual robotics conference ever, kicked off last Sunday with an all-star panel on a critical topic: “COVID-19: How Can Roboticists Help?”

Watch other ICRA keynotes on IEEE.tv.

We’re getting closer! Well, kinda. iRonCub, the jet-powered flying humanoid, is still a simulation for now, but not only are the simulations getting better—the researchers have begun testing real jet engines!

This video shows the latest results on Aerial Humanoid Robotics obtained by the Dynamic Interaction Control Lab at the Italian Institute of Technology. The video simulates robot and jet dynamics, where the latter uses the results obtained in the paper “Modeling, Identification and Control of Model Jet Engines for Jet Powered Robotics” published in IEEE Robotics and Automation Letters.

This video presents the paper entitled “Modeling, Identification and Control of Model Jet Engines for Jet Powered Robotics,” published in IEEE Robotics and Automation Letters (Volume 5, Issue 2, April 2020), pages 2070–2077. Preprint at https://arxiv.org/pdf/1909.13296.pdf.

[ IIT ]

In a new pair of papers, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) came up with new tools to let robots better perceive what they’re interacting with: the ability to see and classify items, and a softer, delicate touch.

[ MIT CSAIL ]

UBTECH’s anti-epidemic solutions greatly relieve the workload of front-line medical staff and cut the consumption of personal protective equipment (PPE).

[ UBTECH ]

We demonstrate a method to assess the concrete deterioration in sewers by performing a tactile inspection motion with a sensorized foot of a legged robot.

[ THING ] via [ ANYmal Research ]

Get a closer look at the Virtual competition of the Urban Circuit and how teams can use the simulated environments to better prepare for the physical courses of the Subterranean Challenge.

[ SubT ]

Roboticists at the University of California San Diego have developed flexible feet that can help robots walk up to 40 percent faster on uneven terrain, such as pebbles and wood chips. The work has applications for search-and-rescue missions as well as space exploration.

[ UCSD ]

Thanks Ioana!

Tsuki is a ROS-enabled, highly dynamic quadruped robot developed by Lingkang Zhang.

And as far as we know, Lingkang is still chasing it.

[ Quadruped Tsuki ]

Thanks Lingkang!

Watch this.

This video shows an impressive demo of YuMi’s precision, using servo gripper fingers and a vacuum suction tool to pick up extremely small parts inside a mechanical watch. The video is not a final application used in production; it is a demo of how such an application could be implemented.

[ ABB ]

Meet Presso, the “5-minute dry cleaning robot.” Can you really call this a robot? We’re not sure. The company says it uses “soft robotics to hold the garment correctly, then clean, sanitize, press and dry under 5 minutes.” The machine was initially designed for use in the hospitality industry, but after adding a disinfectant function for COVID-19, it is now being used on movie and TV sets.

[ Presso ]

The next Mars rover launches next month (!), and here’s a look at some of the instruments on board.

[ JPL ]

Embodied Lead Engineer Peter Teel describes why we chose to build Moxie’s computing system from scratch and what makes it so unique.

[ Embodied ]

I did not know that this is where Pepper’s e-stop is. Nice design!

[ Softbank Robotics ]

The state of the art in swarm robotics lacks systems capable of absolute decentralization and is hence unable to mimic complex biological swarm systems consisting of simple units. Our research interconnects the fields of swarm robotics and computer vision, and introduces a novel use of the vision-based method UVDAR for mutual localization in swarm systems, allowing for the absolute decentralization found among biological swarm systems. The developed methodology allows us to deploy real-world aerial swarming systems with robots directly localizing each other instead of communicating their states via a communication network, which is a typical bottleneck of current state-of-the-art systems.

[ CVUT ]

I’m almost positive I could not do this task.

It’s easy to pick up objects using YuMi’s integrated vacuum functionality. YuMi also supports ABB’s Conveyor Tracking and PickMaster 3 functionality, enabling it to track a moving conveyor and pick up objects using vision. Perfect for consumer products handling applications.

[ ABB ]

Cycling safety gestures, such as hand signals and shoulder checks, are an essential part of safe manoeuvring on the road. Child cyclists, in particular, might have difficulties performing safety gestures on the road or even forget about them, given the lack of cycling experience, road distractions and differences in motor and perceptual-motor abilities compared with adults. To support them, we designed two methods to remind riders about safety gestures while cycling. The first method employs an icon-based reminder in heads-up display (HUD) glasses and the second combines vibration on the handlebar and ambient light in the helmet. We investigated the performance of both methods in a controlled test-track experiment with 18 children using a mid-size tricycle, augmented with a set of sensors to recognize children’s behavior in real time. We found that both systems are successful in reminding children about safety gestures and have their unique advantages and disadvantages.

[ Paper ]

Nathan Sam and Robert “Red” Jensen fabricate and fly a Prandtl-M aircraft at NASA’s Armstrong Flight Research Center in California. The aircraft is the second of three prototypes of varying sizes to provide scientists with options to fly sensors in the Martian atmosphere to collect weather and landing site information for future human exploration of Mars.

[ NASA ]

This is clever: In order to minimize time spent labeling datasets, you can use radar to identify other vehicles, not because the radar can actually recognize other vehicles, but because the radar can recognize other stuff that’s big and moving, which turns out to be almost as good.

[ ICRA Paper ]
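The cross-modal trick described above can be sketched in a few lines: treat radar returns that are both large and moving as free “vehicle” pseudo-labels, sparing humans from annotating camera frames. Everything below (the class, field names, and thresholds) is an illustrative assumption, not the paper’s actual pipeline.

```python
# Hedged sketch: radar returns as weak vehicle labels. Names and thresholds
# are made up for illustration; the paper's real pipeline differs.
from dataclasses import dataclass
from typing import List

@dataclass
class RadarTrack:
    x: float       # longitudinal distance, meters
    y: float       # lateral offset, meters
    speed: float   # radial speed, m/s
    extent: float  # estimated object size, meters

def pseudo_label_vehicles(tracks: List[RadarTrack],
                          min_speed: float = 1.0,
                          min_extent: float = 1.5) -> List[RadarTrack]:
    """Keep returns that are big AND moving: a cheap proxy for 'vehicle'
    that requires no human annotation."""
    return [t for t in tracks
            if abs(t.speed) >= min_speed and t.extent >= min_extent]

tracks = [
    RadarTrack(x=20.0, y=1.0, speed=8.0, extent=4.2),   # moving car: kept
    RadarTrack(x=5.0, y=-2.0, speed=0.0, extent=3.0),   # parked: filtered
    RadarTrack(x=12.0, y=0.5, speed=6.0, extent=0.4),   # small clutter: filtered
]
labels = pseudo_label_vehicles(tracks)
print(len(labels))  # 1
```

In practice the surviving tracks would then be projected into the camera frame to seed bounding-box labels, which is where the “almost as good” trade-off shows up.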

Happy 10th birthday to the Natural Robotics Lab at the University of Sheffield.

[ NRL ]


#437859 We Can Do Better Than Human-Like Hands ...

One strategy for designing robots that are capable in anthropomorphic environments is to make the robots themselves as anthropomorphic as possible. It makes sense—for example, there are stairs all over the place because humans have legs, and legs are good at stairs, so if we give robots legs like humans, they’ll be good at stairs too, right? We also see this tendency when it comes to robotic grippers, because robots need to grip things that have been optimized for human hands.

Despite some amazing robotic hands inspired by the biology of our own human hands, there are also opportunities for creativity in gripper designs that do things human hands are not physically capable of. At ICRA 2020, researchers from Stanford University presented a paper on the design of a robotic hand that has fingers made of actuated rollers, allowing it to manipulate objects in ways that would tie your fingers into knots.

While it’s got a couple fingers, this prototype “roller grasper” hand tosses anthropomorphic design out the window in favor of unique methods of in-hand manipulation. The roller grasper does share some features with other grippers designed for in-hand manipulation using active surfaces (like conveyor belts embedded in fingers), but what’s new and exciting here is that those articulated active roller fingertips (or whatever non-anthropomorphic name you want to give them) provide active surfaces that are steerable. This means that the hand can grasp objects and rotate them without having to resort to complex sequences of finger repositioning, which is how humans do it.

Photo: Stanford University

Picking something flat off of a table, always tricky for robotic hands (and sometimes for human hands as well), is a breeze thanks to the fingertip rollers.

Each of the hand’s fingers has three actuated degrees of freedom, which result in several different ways in which objects can be grasped and manipulated. Picking something flat off of a table, always tricky for robotic hands (and sometimes for human hands as well), is a breeze thanks to the fingertip rollers. The motion of an object in this gripper isn’t quite holonomic, meaning that it can’t arbitrarily reorient things without sometimes going through other intermediate steps. And it’s also not compliant in the way that many other grippers are, limiting some types of grasps. This particular design probably won’t replace every gripper out there, but it’s particularly skilled at some specific kinds of manipulations in a way that makes it unique.

We should be clear that it’s not the intent of this paper (or of this article!) to belittle five-fingered robotic hands—the point is that there are lots of things that you can do with totally different hand designs, and just because humans use one kind of hand doesn’t mean that robots need to do the same if they want to match (or exceed) some specific human capabilities. If we could make robotic hands with five fingers that had all of the actuation and sensing and control that our own hands do, that would be amazing, but it’s probably decades away. In the meantime, there are plenty of different designs to explore.

And speaking of exploring different designs, these same folks are already at work on version two of their hand, which replaces the fingertip rollers with fingertip balls:

For more on this new version of the hand (among other things), we spoke with lead author Shenli Yuan via email. And the ICRA page is here if you have questions of your own.

IEEE Spectrum: Human hands are often seen as the standard for manipulation. Given that adding degrees of freedom that human hands don’t have (as in your work) can make robotic hands more capable than ours in many ways, do you think we should still regard human hands as something to emulate?

Shenli Yuan: Yes, definitely. Not only because human hands have great manipulation capability, but because we’re constantly surrounded by objects that were designed and built specifically to be manipulated by the human hand. Anthropomorphic robot hands are still worth investigating, and still have a long way to go before they truly match the dexterity of a human hand. The design we came up with is an exploration of what unique capabilities may be achieved if we are not bound by the constraints of anthropomorphism, and what a biologically impossible mechanism may achieve in robotic manipulation. In addition, for lots of tasks, it isn’t necessarily optimal to try and emulate the human hand. Perhaps in 20 to 50 years when robot manipulators are much better, they won’t look like the human hand that much. The design constraints for robotics and biology have points in common (like mechanical wear, finite tendon stiffness) but also major differences (like continuous rotation for robots and fewer heat dissipation problems for humans).

“For lots of tasks, it isn’t necessarily optimal to try and emulate the human hand. Perhaps in 20 to 50 years when robot manipulators are much better, they won’t look like the human hand that much.”
—Shenli Yuan, Stanford University

What are some manipulation capabilities of human hands that are the most difficult to replicate with your system?

There are a few things that come to mind. It cannot perform a power grasp (using the whole hand for grasping as opposed to pinch grasp that uses only fingertips), which is something that can be easily done by human hands. It cannot move or rotate objects instantaneously in arbitrary directions or about arbitrary axes, though the human hand is somewhat limited in this respect as well. It also cannot perform gaiting. That being said, these limitations exist largely because this grasper only has 9 degrees of freedom, as opposed to the human hand which has more than 20. We don’t think of this grasper as a replacement for anthropomorphic hands, but rather as a way to provide unique capabilities without all of the complexity associated with a highly actuated, humanlike hand.

What’s the most surprising or impressive thing that your hand is able to do?

The most impressive feature is that it can rotate objects continuously, which is typically difficult or inefficient for humanlike robot hands. Something really surprising was that we put most of our energy into the design and analysis of the grasper, and the control strategy we implemented for demonstrations is very simple. This simple control strategy works surprisingly well with very little tuning or trial-and-error.

With this many degrees of freedom, how complicated is it to get the hand to do what you want it to do?

The number of degrees of freedom is actually not what makes controlling it difficult. Most of the difficulties we encountered were actually due to the rolling contact between the rollers and the object during manipulation. The rolling behavior can be viewed as constantly breaking and re-establishing contacts between the rollers and objects; this very dynamic behavior introduces uncertainties in controlling our grasper. Specifically, it was difficult to estimate the velocity of each contact point with the object, which changes based on object and finger position, object shape (especially curvature), and slip/no slip.
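The contact-velocity difficulty Yuan describes follows from standard rigid-body kinematics: the velocity of the rolling contact point is the roller center’s velocity plus the angular velocity crossed with the center-to-contact vector, and every term in that sum changes as the grasp evolves. A minimal sketch of the relation (illustrative only, not the grasper’s actual estimator):

```python
# v_contact = v_center + omega x r, the rigid-body velocity of a point on a
# rotating roller. Plain tuples keep this dependency-free.
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def contact_point_velocity(v_center, omega, r_contact):
    """Velocity of the contact point: center velocity plus omega x r."""
    w = cross(omega, r_contact)
    return tuple(v + c for v, c in zip(v_center, w))

# Roller spinning at 2 rad/s about z, contact point 1 cm from center along x:
v = contact_point_velocity((0, 0, 0), (0, 0, 2.0), (0.01, 0, 0))
print(v)  # (0.0, 0.02, 0.0): the contact moves tangentially at 2 cm/s
```

The hard part in the real system is that `omega`, `r_contact`, and the slip state all have to be estimated online from object shape and finger pose, which is exactly the uncertainty the interview points at.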

What more can you tell us about Roller Grasper V2?

Roller Grasper V2 has spherical rollers, while the V1 has cylindrical rollers. We realized that cylindrical rollers are very good at manipulating objects when the rollers and the object form line contacts, but it can be unstable when the grasp geometry doesn’t allow for a line contact between each roller and the grasped object. Spherical rollers solve that problem by allowing predictable points of contact regardless of how a surface is oriented.

The parallelogram mechanism of Roller Grasper V1 makes the pivot axis offset a bit from the center of the roller, which made our control and analysis more challenging. The kinematics of the Roller Grasper V2 is simpler. The base joint intersects with the finger, which intersects with the pivot joint, and the pivot joint intersects with the roller joint. Its symmetrical design and simpler kinematics make our control and analysis a lot more straightforward. Roller Grasper V2 also has a larger pivot range of 180 degrees, while V1 is limited to 90 degrees.

In terms of control, we implemented more sophisticated control strategies (including a hand-crafted control strategy and an imitation learning based strategy) for the grasper to perform autonomous in-hand manipulation.

“Design of a Roller-Based Dexterous Hand for Object Grasping and Within-Hand Manipulation,” by Shenli Yuan, Austin D. Epps, Jerome B. Nowak, and J. Kenneth Salisbury from Stanford University is being presented at ICRA 2020.



#437857 Video Friday: Robotic Third Hand Helps ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICRA 2020 – June 1-15, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

We are seeing some exciting advances in the development of supernumerary robotic limbs. But one thing about this technology remains a major challenge: How do you control the extra limb if your own hands are busy—say, if you’re carrying a package? MIT researchers at Professor Harry Asada’s lab have an idea. They are using subtle finger movements in sensorized gloves to control the supernumerary limb. The results are promising, and they’ve demonstrated a waist-mounted arm with a qb SoftHand that can help you with doors, elevators, and even handshakes.

[ Paper ]

ROBOPANDA

Fluid-actuated soft robots, or fluidic elastomer actuators, have shown great potential in robotic applications where large compliance and safe interaction are dominant concerns. They have been widely studied in wearable robotics, prosthetics, and rehabilitation in recent years. However, such soft robots and actuators are tethered to a bulky pump and controlled by various valves, limiting their applications to small confined spaces. In this study, we report a new and effective approach to fluidic power actuation that is untethered; easy to design, fabricate, and control; and allows various modes of actuation. In the proposed approach, a sealed elastic tube filled with fluid (gas or liquid) is segmented by adaptors. When twisting a segment, two major effects can be observed: (1) the twisted segment exhibits a contraction force and (2) other segments inflate or deform according to their constraint patterns.

[ Paper ]

And now: “Magnetic cilia carpets.”

[ ETH Zurich ]

To adhere to government recommendations while maintaining requirements for social distancing during the COVID-19 pandemic, Yaskawa Motoman is now utilizing an HC10DT collaborative robot to take individual employee temperatures. Named “Covie,” the robotic solution and its software were designed and fabricated through a combined effort by Yaskawa Motoman’s Technology Advancement Team (TAT) and Product Solutions Group (PSG), as well as a group of robotics students from the University of Dayton.

They should have programmed it to nod if your temperature was normal, and smack you upside the head while yelling “GO HOME” if it wasn’t.

[ Yaskawa ]

Driving slowly on pre-defined routes, ZMP’s RakuRo autonomous vehicle helps people with mobility challenges enjoy cherry blossoms in Japan.

RakuRo costs about US $1,000 per month to rent, but ZMP suggests that facilities or groups of ~10 people could get together and share one, which makes the cost much more reasonable.

[ ZMP ]

Jessy Grizzle from the Dynamic Legged Locomotion Lab at the University of Michigan writes:

Our lab closed on March 20, 2020 under the State of Michigan’s “Stay Home, Stay Safe” order. For a 24-hour period, it seemed that our labs would be “sanitized” during our absence. Since we had no idea what that meant, we decided that Cassie Blue needed to “Stay Home, Stay Safe” as well. We loaded up a very expensive robot and took her off campus. On May 26, we were allowed to re-open our laboratory. After thoroughly cleaning the lab, disinfecting tools and surfaces, developing and getting approval for new safe operation procedures, we then re-organized our work areas to respect social distancing requirements and brought Cassie back to the laboratory.

During the roughly two months we were working remotely, the lab’s members got a lot done. Papers were written, dissertation proposals were composed, and plans for a new course, ROB 101, Computational Linear Algebra, were developed with colleagues. In addition, one of us (Yukai Gong) found the lockdown to his liking! He needed the long period of quiet to work through some new ideas for how to control 3D bipedal robots.

[ Michigan Robotics ]

Thanks Jesse and Bruce!

You can tell that this video of how Pepper has been useful during COVID-19 is not focused on the United States, since it refers to the pandemic in past tense.

[ Softbank Robotics ]

NASA’s water-seeking robotic Moon rover just booked a ride to the Moon’s South Pole. Astrobotic of Pittsburgh, Pennsylvania, has been selected to deliver the Volatiles Investigating Polar Exploration Rover, or VIPER, to the Moon in 2023.

[ NASA ]

This could be the most impressive robotic gripper demo I have ever seen.

[ Soft Robotics ]

Whiz, an autonomous vacuum sweeper, innovates the cleaning industry by automating tedious tasks for your team. Easy to train, easy to use, Whiz works with your staff to deliver a high-quality clean while increasing efficiency and productivity.

[ Softbank Robotics ]

About 40 seconds into this video, a robot briefly chases a goose.

[ Ghost Robotics ]

SwarmRail is a new concept for rail-guided omnidirectional mobile robot systems. It aims for a highly flexible production process in the factory of the future by opening up the available work space from above. This means that transport and manipulation tasks can be carried out by floor- and ceiling-bound robot systems. The special feature of the system is the combination of omnidirectionally mobile units with a grid-shaped rail network, which is characterized by passive crossings and a continuous gap between the running surfaces of the rails. Through this gap, a manipulator operating below the rail can be connected to a mobile unit traveling on the rail.

[ DLRRMC ]

RightHand Robotics (RHR), a leader in providing robotic piece-picking solutions, has partnered with PALTAC Corporation, Japan’s largest wholesaler of consumer packaged goods. The collaboration introduces RightHand’s newest piece-picking solution to the Japanese market, with multiple workstations installed in PALTAC’s newest facility, RDC Saitama, which opened in 2019 in Sugito, Saitama Prefecture, Japan.

[ RightHand Robotics ]

From ICRA 2020, a debate on the “Future of Robotics Research,” addressing such propositions as “robotics research is over-reliant on benchmark datasets and simulation” and “robots designed for personal or household use have failed because of fundamental misunderstandings of Human-Robot Interaction (HRI).”

[ Robotics Debates ]

MassRobotics has a series of interviews where robotics celebrities are interviewed by high school students. The students are perhaps a little awkward (remember being in high school?), but it’s honest and the questions are interesting. The first two interviews are with Laurie Leshin, who worked on space robots at NASA and is now President of Worcester Polytechnic Institute, and Colin Angle, founder and CEO of iRobot.

[ MassRobotics ]

Thanks Andrew!

In this episode of the Voices from DARPA podcast, Dr. Timothy Chung, a program manager since 2016 in the agency’s Tactical Technology Office, delves into his robotics and autonomous technology programs – the Subterranean (SubT) Challenge and OFFensive Swarm-Enabled Tactics (OFFSET). From robot soccer to live-fly experimentation programs involving dozens of unmanned aircraft systems (UASs), he explains how he aims to assist humans heading into unknown environments via advances in collaborative autonomy and robotics.

[ DARPA ]


#437826 Video Friday: Skydio 2 Drone Is Back on ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Skydio, which makes what we’re pretty sure is the most intelligent consumer drone (or maybe just drone period) in existence, has been dealing with COVID-19 just like the rest of us. Even so, they’ve managed to push out a major software update, and pre-orders for the Skydio 2 are now open again.

If you think you might want one, read our review, after which you’ll be sure you want one.

[ Skydio ]

Worried about people with COVID entering your workplace? Misty II has your front desk covered, in a way that’s quite a bit friendlier than many other options.

Misty II provides a dynamic and interactive screening experience that delivers a joyful experience in an otherwise depressing moment while also delivering state of the art thermal scanning and health screening. We have already found that employees, customers, and visitors appreciate the novelty of interacting with a clever and personable robot. Misty II engages dynamically, both visually and verbally. Companies appreciate using a solution with a blackbody-referenced thermal camera that provides high accuracy and a short screening process for efficiency. Putting a robot to work in this role shifts not only how people look at the screening process but also how robots can take on useful assignments in business, schools and homes.

[ Misty Robotics ]

Thanks Tim!

I’m definitely the one in the middle.

[ Agility Robotics ]

NASA’s Ingenuity helicopter is traveling to Mars attached to the belly of the Perseverance rover and must safely detach to begin the first attempt at powered flight on another planet. Tests done at NASA’s Jet Propulsion Laboratory and Lockheed Martin Space show the sequence of events that will bring the helicopter down to the Martian surface.

[ JPL ]

Here’s a sequence of videos of Cassie Blue making it (or mostly making it) up a 22-degree slope.

My mood these days is Cassie at 1:09.

[ University of Michigan ]

Thanks Jesse!

This is somewhere on the line between home automation and robotics, but it’s a cool idea: A baby crib that “uses computer vision and machine learning to recognize subtle changes” in an infant’s movement, and proactively bounces them to keep them sleeping peacefully.

It costs $1000, but how much value do you put on 24 months of your own sleep?

[ Cradlewise ]

Thanks Ben!

As captive marine mammal shows have fallen from favor, and the catching, transporting, and breeding of marine animals has become more restricted, the marine park industry as a viable business has become more challenging. Yet the audience appetite for this type of entertainment and education has remained constant.

Real-time Animatronics provide a way to reinvent the marine entertainment industry with a sustainable, safe, and profitable future. Show venues include aquariums, marine parks, theme parks, fountain shows, cruise lines, resort hotels, shopping malls, museums, and more.

[ EdgeFX ] via [ Gizmodo ]

Robotic cabling is surprisingly complex and kinda cool to watch.

The video shows the sophisticated robot application “Automatic control cabinet cabling”, which Fraunhofer IPA implemented together with the company Rittal. The software pitasc, developed at Fraunhofer IPA, is used for force-controlled assembly processes. Two UR robot arms carry out the task together. The modular pitasc system enables the robot arms to move and rotate in parallel. They work hand in hand, with one robot holding the cable and the second bringing it to the starting position for the cabling. The robots can find, tighten, hold ready, lay, plug in, fix, move freely or immerse cables. They can also perform push-ins and pull tests.

[ Fraunhofer ]

This is from 2018, but the concept is still pretty neat.

We propose to perform a novel investigation into the ability of a propulsively hopping robot to reach targets of high science value on the icy, rugged terrains of Ocean Worlds. The employment of a multi-hop architecture allows for the rapid traverse of great distances, enabling a single mission to reach multiple geologic units within a timespan conducive to system survival in a harsh radiation environment. We further propose that the use of a propulsive hopping technique obviates the need for terrain topographic and strength assumptions and allows for complete terrain agnosticism; a key strength of this concept.

[ NASA ]

Aerial-aquatic robots possess the unique ability to operate in both air and water. However, this capability comes with tremendous challenges, such as communication incompatibility, increased airborne mass, potentially inefficient operation in each of the environments, and manufacturing difficulties. Such robots, therefore, typically have small payloads and a limited operational envelope, often making their field usage impractical. We propose a novel robotic water sampling approach that combines the robust technologies of multirotors and underwater micro-vehicles into a single integrated tool usable for field operations.

[ Imperial ]

Event cameras are bio-inspired vision sensors with microsecond latency, much larger dynamic range, and a hundred times lower power consumption than standard cameras. This 20-minute talk gives a short tutorial on event cameras and shows their applications in computer vision, drones, and cars.

[ UZH ]

We interviewed Paul Newman, Perla Maiolino and Lars Kunze, ORI academics, to hear what gets them excited about robots in the future and any advice they have for those interested in the field.

[ Oxford Robotics Institute ]

Two projects from the Rehabilitation Engineering Lab at ETH Zurich, including a self-stabilizing wheelchair and a soft exoskeleton for grasping assistance.

[ ETH Zurich ]

Silicon Valley Robotics hosted an online conversation about robotics and racism. Moderated by Andra Keay, the panel featured Maynard Holliday, Tom Williams, Monroe Kennedy III, Jasmine Lawrence, Chad Jenkins, and Ken Goldberg.

[ SVR ]

The ICRA Legged Locomotion workshop has been taking place online, and while we’re not getting a robot mosh pit, there are still some great talks. We’ll post two here, but for more, follow the legged robots YouTube channel at the link below.

[ YouTube ]


#437820 In-Shoe Sensors and Mobile Robots Keep ...


Researchers at Stevens Institute of Technology are leveraging some of the newest mechanical and robotic technologies to help some of our oldest populations stay healthy, active, and independent.

Yi Guo, professor of electrical and computer engineering and director of the Robotics and Automation Laboratory, and Damiano Zanotto, assistant professor of mechanical engineering, and director of the Wearable Robotic Systems Laboratory, are collaborating with Ashley Lytle, assistant professor in Stevens’ College of Arts and Letters, and Ashwini K. Rao of Columbia University Medical Center, to combine an assistive mobile robot companion with wearable in-shoe sensors in a system designed to help elderly individuals maintain the balance and motion they need to thrive.

“Balance and motion can be significant issues for this population, and if elderly people fall and experience an injury, they are less likely to stay fit and exercise,” Guo said. “As a consequence, their level of fitness and performance decreases. Our mobile robot companion can help decrease the chances of falling and contribute to a healthy lifestyle by keeping their walking function at a good level.”

The mobile robots are designed to lead walking sessions and, using the in-shoe sensors, to monitor the user’s gait, flag issues, and adjust the exercise speed and pace. The initiative is part of a four-year National Science Foundation research project.

“For the first time, we’re integrating our wearable sensing technology with an autonomous mobile robot,” said Zanotto, who worked with elderly people at Columbia University Medical Center for three years before coming to Stevens in 2016. “It’s exciting to be combining these different areas of expertise to leverage the strong points of wearable sensing technology, such as accurately capturing human movement, with the advantages of mobile robotics, such as much larger computational powers.”

The team is developing algorithms that fuse real-time data from smart, unobtrusive, in-shoe sensors and advanced on-board sensors to inform the robot’s navigation protocols and control the way the robot interacts with elderly individuals. It’s a promising way to assist seniors in safely doing walking exercises and maintaining their quality of life.

Bringing the benefits of the lab to life

Guo and Zanotto are working with Lytle, an expert in social and health psychology, to implement a social connectivity capability and make the bi-directional interaction between human and robot even more intuitive, engaging, and meaningful for seniors.

“Especially during COVID, it’s important for elderly people living on their own to connect socially with family and friends,” Zanotto said, “and the robot companion will also offer teleconferencing tools to provide that interaction in an intuitive and transparent way.”

“We want to use the robot for social connectedness, perhaps integrating it with a conversation agent such as Alexa,” Guo added. “The goal is to make it a companion robot that can sense, for example, that you are cooking, or you’re in the living room, and help with things you would do there.”

It’s a powerful example of how abstract concepts can have meaningful real-life benefits.

“As engineers, we tend to work in the lab, trying to optimize our algorithms and devices and technologies,” Zanotto noted, “but at the end of the day, what we do has limited value unless it has impact on real life. It’s fascinating to see how the devices and technologies we’re developing in the lab can be applied to make a difference for real people.”

Maintaining balance in a global pandemic

Although COVID-19 has delayed the planned testing at a senior center in New York City, it has not stopped the team’s progress.

“Although we can’t test on elderly populations yet, our students are still testing in the lab,” Guo said. “This summer and fall, for the first time, the students validated the system’s real-time ability to monitor and assess the dynamic margin of stability during walking—in other words, to evaluate whether the person following the robot is walking normally or has a risk of falling. They’re also designing parameters for the robot to give early warnings and feedback that help the human subjects correct posture and gait issues while walking.”

Those warnings would be literally underfoot, as the in-shoe sensors would pulse like a vibrating cell phone to deliver immediate directional information to the subject.
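For readers curious what “dynamic margin of stability” means concretely: a common formulation (Hof’s extrapolated center of mass) shifts the center-of-mass position by a velocity-dependent term and measures how far that point stays inside the base of support. The sketch below uses that textbook formulation with made-up numbers; it is not the Stevens team’s actual algorithm.

```python
import math

def margin_of_stability(com_pos, com_vel, bos_edge, leg_length, g=9.81):
    """Hof's margin of stability along one axis, in meters.

    The extrapolated center of mass (XCoM) shifts the CoM position in the
    direction of its velocity, scaled by the inverted-pendulum frequency
    sqrt(g / leg_length). The margin is the distance from the XCoM to the
    edge of the base of support; positive means stable."""
    omega0 = math.sqrt(g / leg_length)
    xcom = com_pos + com_vel / omega0
    return bos_edge - xcom

# Made-up example: CoM over the ankle, moving forward at 0.30 m/s, with the
# base-of-support edge 12 cm ahead and a 0.9 m pendulum length.
mos = margin_of_stability(com_pos=0.0, com_vel=0.30,
                          bos_edge=0.12, leg_length=0.9)
print(round(mos, 3))  # 0.029 m: still stable, but not by much
```

When a margin like this trends toward zero in real time, that is the kind of event the in-shoe sensors could flag with a vibrotactile pulse.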

“We’re not the first to use this vibrotactile stimuli technology, but this application is new,” Zanotto said.

So far, the team has published papers in top robotics publication venues including IEEE Transactions on Neural Systems and Rehabilitation Engineering and the 2020 IEEE International Conference on Robotics and Automation (ICRA). It’s a big step toward realizing the synergies of bringing the technical expertise of engineers to bear on the clinical focus on biometrics—and the real lives of seniors everywhere.
