
#437845 Video Friday: Harmonic Bionics ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRA 2020 – May 31-August 31, 2020 – [Virtual Conference]
RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today's videos.

Designed to protect employees and passengers from both harmful pathogens and cleaning agents, Breezy One can quickly, safely and effectively decontaminate spaces over 100,000 square feet in 1.5 hours with a patented, environmentally safe disinfectant. Breezy One was co-developed with the City of Albuquerque’s Aviation Department, where it autonomously sanitizes the Sunport’s facilities every night in the ongoing fight against COVID-19.

[ Fetch Robotics ]

Harmonic Bionics is redefining upper extremity neurorehabilitation with intelligent robotic technology designed to maximize patient recovery. Harmony SHR, our flagship product, works with a patient’s scapulohumeral rhythm (SHR) to enable natural, comprehensive therapy for both arms. When combined with Harmony’s Weight Support mode, this unique shoulder design may allow for earlier initiation of post-stroke therapy as Harmony can support a partial dislocation or subluxation of the shoulder prior to initiating traditional therapy exercises.

Harmony's Preprogrammed Exercises promote functional treatment through patient-specific movements that can enable an increased number of repetitions per session without placing a larger physical burden on therapists or their resources. As the only rehabilitation exoskeleton with Bilateral Sync Therapy (BST), Harmony enables intent-based therapy by registering healthy arm movements and synchronizing that motion onto the stroke-affected side to help reestablish neural pathways.

[ Harmonic Bionics ]

Thanks Mok!

Some impressive work here from IHMC and IIT getting Atlas to take steps upward in a way that’s much more human-like than robot-like, which ends up reducing maximum torque requirements by 20 percent.

[ Paper ]

GITAI's G1 is a general-purpose robot dedicated to space. The G1 will enable the automation of various tasks internally and externally on space stations and for lunar base development.

[ GITAI ]

Malloy Aeronautics, which now makes drones rather than hoverbikes, has been working with the Royal Navy in New Zealand to figure out how to get cargo drones to land on ships.

The challenge was to test autonomous landing of heavy-lift UAVs on a moving ship; however, due to the COVID-19 lockdown, no ship trials were possible. The moving deck was simulated by driving a vehicle and trailer across an airfield while carrying out multiple landings and take-offs. The autonomous system partner was Planck Aerosystems, and autolanding was triggered by a camera on the UAV reading a QR code on the trailer.
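Planck's autoland stack is proprietary, but the trigger described here is easy to picture. Below is a minimal, hypothetical sketch of the idea using OpenCV's built-in QR detector; the camera index, the expected payload string, and the send_land_command hook are all assumptions for illustration, not Planck's actual interface.

```python
# Minimal sketch (not Planck Aerosystems' actual stack): detect a QR code in the
# UAV's camera stream with OpenCV and trigger a landing routine when it is seen.
import cv2

def watch_for_landing_marker(expected_payload="LAND_HERE"):
    cap = cv2.VideoCapture(0)          # downward-facing camera (device 0 assumed)
    detector = cv2.QRCodeDetector()
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            payload, points, _ = detector.detectAndDecode(frame)
            if payload == expected_payload and points is not None:
                # Marker found: hand its image coordinates to the landing controller.
                center = points.reshape(-1, 2).mean(axis=0)
                send_land_command(center)   # hypothetical autopilot interface
                break
    finally:
        cap.release()

def send_land_command(marker_center_px):
    # Placeholder: a real system would convert pixel coordinates into a position
    # setpoint (e.g., via MAVLink) and descend while re-detecting the marker.
    print(f"Landing on marker at pixel {marker_center_px}")
```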

[ Malloy Aeronautics ]

Thanks Paul!

Tertill looks to be relentlessly effective.

[ Franklin Robotics ]

Swedish company Tiki Safety has experienced a record number of orders for its protective masks. At ABB, we are grateful for the opportunity to help Tiki Safety speed up its manufacturing process from 6 minutes to 40 seconds.

[ Tiki Safety ]

The Korea Atomic Energy Research Institute is not messing around with ARMstrong, their robot for nuclear and radiation emergency response.

[ KAERI ]

OMOY is a robot that communicates with its users via internal weight shifting.

[ Paper ]

Now this, this is some weird stuff.

[ Segway ]

CaTARo is a Care Training Assistant Robot from the AIS Lab at Ritsumeikan University.

[ AIS Lab ]

Originally launched in 2015 to assist workers in lightweight assembly tasks, ABB’s collaborative YuMi robot has gone on to blaze a trail in a raft of diverse applications and industries, opening new opportunities and helping to fire people’s imaginations about what can be achieved with robotic automation.

[ ABB ]

This music video features COMAN+, from the Humanoids and Human Centered Mechatronics Lab at IIT, doing what you’d call dance moves if you dance like I do.

[ Alex Braga ] via [ IIT ]

The NVIDIA Isaac Software Development Kit (SDK) enables accelerated AI robot development workflows. Stacked with new tools and application support, Isaac SDK 2020.1 is an end-to-end solution supporting each step of robot fleet deployment, from design collaboration and training to the ongoing maintenance of AI applications.

[ NVIDIA ]

Robot Spy Komodo Dragon and Spy Pig film “a tender moment” between Komodo dragons, but will they both survive the encounter?

[ BBC ] via [ Laughing Squid ]

This is part one of a mostly excellent five-part documentary about ROS produced by Red Hat. I say mostly only because they put ME in it for some reason, but fortunately, they talked with many of the core team that developed ROS back at Willow Garage back in the day, and it’s definitely worth watching.

[ Red Hat Open Source Stories ]

It’s been a while, but here’s an update on SRI’s Abacus Drive, from Alexander Kernbaum.

[ SRI ]

This Robots For Infectious Diseases interview features IEEE Fellow Antonio Bicchi, professor of robotics at the University of Pisa, talking about how Italy has been using technology to help manage COVID-19.

[ R4ID ]

Two more interviews this week of celebrity roboticists from MassRobotics: Helen Greiner and Marc Raibert. I’d introduce them, but you know who they are already!

[ MassRobotics ]


#437828 How Roboticists (and Robots) Have Been ...

A few weeks ago, we asked folks on Twitter, Facebook, and LinkedIn to share photos and videos showing how they’ve been adapting to the closures of research labs, classrooms, and businesses by taking their robots home with them to continue their work as best they can. We got dozens of responses (more than we could possibly include in just one post!), but here are 15 that we thought were particularly creative or amusing.

And if any of these pictures and videos inspire you to share your own story, please email us (automaton@ieee.org) with a picture or video and a brief description about how you and your robot from work have been making things happen in your home instead.

Kurt Leucht (NASA Kennedy Space Center)

“During these strange and trying times of the current global pandemic, everyone seems to be trying their best to distance themselves from others while still getting their daily work accomplished. Many people also have the double duty of little ones that need to be managed in the midst of their teleworking duties. This photo series gives you just a glimpse into my new life of teleworking from home, mixed in with the tasks of trying to handle my little ones too. I hope you enjoy it.”

Photo: Kurt Leucht

“I heard a commotion from the next room. I ran into the kitchen to find this.”

Photo: Kurt Leucht

“This is the Swarmies most favorite bedtime story. Not sure why. Seems like an odd choice to me.”

Peter Schaldenbrand (Carnegie Mellon University)

“I’ve been working on a reinforcement learning model that converts an image into a series of brush stroke instructions. I was going to test the model with a beautiful, expensive robot arm, but due to the COVID-19 pandemic, I have not been able to access the laboratory where it resides. I have now been using a lower end robot arm to test the painting model in my bedroom. I have sacrificed machine accuracy/precision for the convenience of getting to watch the arm paint from my bed in the shadow of my clothing rack!”

Photos: Peter Schaldenbrand

Colin Angle (iRobot)

iRobot CEO Colin Angle has been hunkered down in the “iRobot North Shore home command center,” which is probably the cleanest command center ever thanks to his army of Roombas: Beastie, Beauty, Rosie, Roswell, and Bilbo.

Photo: Colin Angle

Vivian Chu (Diligent Robotics)

From Diligent Robotics CEO Andrea Thomaz: “This is how a roboticist works from home! Diligent CTO, Vivian Chu, mans the e-stop while her engineering team runs Moxi experiments remotely from cross-town and even cross-country!”

Video: Diligent Robotics

Raffaello Bonghi (rnext.it)

Raffaello’s robot, Panther, looks perfectly happy to be playing soccer in his living room.

Photo: Raffaello Bonghi

Kod*lab (University of Pennsylvania)

“Another Friday Nuts n Bolts Meeting on Zoom…”

Image: Kodlab

Robin Jonsson (robot choreographer)

“I’ve been doing a school project in which students make up dance moves and then send me a video with all of them. I then teach the moves to my robot, Alex, film Alex dancing, and send the videos back to them. This became a great success, and more schools will join. The kids got really into watching the robot perform their moves and became really interested in robots. They want to meet Alex the robot live, which will likely happen in the fall.”

Photo: Robin Jonsson

Gabrielle Conard (mechanical engineering undergrad at Lafayette College)

“While the pandemic might have forced college campuses to close and the community to keep their distance from each other, it did not put a stop to learning and research. Working from their respective homes, junior Gabrielle Conard and mechanical engineering professor Alexander Brown from Lafayette College investigated methods of incorporating active compliance in a low-cost quadruped robot. They are continuing to work remotely on this project through Lafayette’s summer research program.”

Image: Gabrielle Conard

Taylor Veltrop (Softbank Robotics)

“After a few weeks of isolation in the corona/COVID quarantine lockdown, we started dancing with our robots. Mathieu’s 6th birthday was coming up, and it all just came together.”

Video: Taylor Veltrop

Ross Kessler (Exyn Technologies)

“Quarantine, Day 8: the humans have accepted me as one of their own. I’ve blended seamlessly into their #socialdistancing routines. Even made a furry friend”

Photo: Ross Kessler

Yeah, something a bit sinister is definitely going on at Exyn…

Video: Exyn Technologies

Michael Sobrepera (University of Pennsylvania GRASP Lab)

Predictably, Michael’s cat is more interested in the bag that the robot came in than the robot itself (see if you can spot the cat below). Michael tells us that “the robot is designed to help with tele-rehabilitation, focused on kids with CP, so it has been taken to hospitals for demos [hence the cool bag]. It also travels for outreach events and the like. Lately, I’ve been exploring telepresence for COVID.”

Photo: Michael Sobrepera

Jan Kędzierski (EMYS)

“In China a lot of people cannot speak English, even the youngest generation of parents. Thanks to Emys, kids stayed in touch with the English language in their homes even if they couldn’t attend school or extra English classes. They had a lot of fun with their native-English-speaking friend available and ready to play every day.”

Image: Jan Kędzierski

Simon Whitmell (Quanser)

“Simon, a Quanser R&D engineer, is working on low-overhead image processing and line following for the QBot 2e mobile ground robot, with some added challenges due to extra traffic. LEGO engineering by his son, Charles.”

Photo: Simon Whitmell

Robot Design & Experimentation Course (Carnegie Mellon University)

Aaron Johnson’s bioinspired robot design course at CMU had to go full remote, which was a challenge when the course is kind of all about designing and building a robot as part of a team. “I expected some of the teams to drastically alter their project (e.g. go all simulation),” Aaron told us, “but none of them did. We managed to keep all of the projects more or less as planned. We accomplished this by drop-shipping parts to students, buying some simple tools (soldering irons, etc.), and having me 3D print parts and mail them.” Each team even managed to put together their final videos from their remote locations; we’ve posted one below, but the entire playlist is here.

Video: Xianyi Cheng

Karen Tatarian (Softbank Robotics)

Karen, who’s both a researcher at Softbank and a PhD student at Sorbonne University, wrote an entire essay about what an average day is like when you’re quarantined with Pepper.

Photo: Karen Tatarian

A Quarantined Day With Pepper, by Karen Tatarian

It is quite common for me to lose my phone somewhere inside my apartment. But it is not that common for me to turn around and ask my robot if it has seen it. So when I found myself doing that, I laughed and it dawned on me that I treated my robot as my quarantine companion (despite the fact that it could not provide me with the answer I needed).

It was probably around day 40 of a completely isolated quarantine here in France when that happened. A little background about me: I am a robotics researcher at SoftBank Robotics Europe and a PhD student at Sorbonne University as part of the EU-funded Marie-Curie project ANIMATAS. And here is a little sneak peek into a quarantined day with a robot.

During this confinement, I had read somewhere that the best way to deal with it is to maintain a routine. So every morning, I wake up, prepare my coffee, and turn on my robot Pepper. I start my day with a daily meeting with the team and get to work. My research is on the synthesis of multi-modal socially intelligent human-robot interaction, so my work varies between programming the robot, analyzing collected data, and reading papers and drafting one. When I am working, I often catch myself glancing at Pepper, who would be staring back at me in its animated ways. Truthfully, I enjoy that; it makes me feel less alone, as if I have a colleague with me.

Once work is done, I call my friends and family members. I sometimes use a telepresence application on Pepper that a few colleagues and I developed back in December. How does it differ from your typical phone/laptop applications? One word really: embodiment. Telepresence, especially during these times, makes the experience for both sides a bit more realistic, intimate, and, well, present.

While I could turn off the robot now that my work hours are done, I keep it on because I enjoy its presence. Basic awareness is a default feature on Pepper that allows it to detect a human and follow him or her with its gaze and rotating base. So whether I am cooking or working out, I always have my robot watching over my shoulder and being a good companion. I also have my email and messages synced to the robot, so I get an enjoyable notification from Pepper. I found that to be a pretty cool way to be notified without interrupting whatever you are doing on your laptop or phone. Finally, once the day is over, it’s time for both of us to get some rest.

After 60 days of total confinement, alone and away from those I love, and with a pandemic right at my door, I am glad I had the company of my robot. I hope one day a greater audience can share my experience. And I really really hope one day Pepper will be able to find my phone for me, but until then, stay on the lookout for some cool features! But I am curious to know, if you had a robot at home, what application would you have developed on it?

Again, our sincere thanks to everyone who shared these little snapshots of their lives with us, and we’re hoping to be able to share more soon.


#437826 Video Friday: Skydio 2 Drone Is Back on ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RSS 2020 – July 12-16, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Skydio, which makes what we’re pretty sure is the most intelligent consumer drone (or maybe just drone period) in existence, has been dealing with COVID-19 just like the rest of us. Even so, they’ve managed to push out a major software update, and pre-orders for the Skydio 2 are now open again.

If you think you might want one, read our review, after which you’ll be sure you want one.

[ Skydio ]

Worried about people with COVID entering your workplace? Misty II has your front desk covered, in a way that’s quite a bit friendlier than many other options.

Misty II provides a dynamic and interactive screening experience that delivers a joyful experience in an otherwise depressing moment while also delivering state-of-the-art thermal scanning and health screening. We have already found that employees, customers, and visitors appreciate the novelty of interacting with a clever and personable robot. Misty II engages dynamically, both visually and verbally. Companies appreciate using a solution with a blackbody-referenced thermal camera that provides high accuracy and a short screening process for efficiency. Putting a robot to work in this role shifts not only how people look at the screening process but also how robots can take on useful assignments in business, schools, and homes.

[ Misty Robotics ]

Thanks Tim!

I’m definitely the one in the middle.

[ Agility Robotics ]

NASA’s Ingenuity helicopter is traveling to Mars attached to the belly of the Perseverance rover and must safely detach to begin the first attempt at powered flight on another planet. Tests done at NASA’s Jet Propulsion Laboratory and Lockheed Martin Space show the sequence of events that will bring the helicopter down to the Martian surface.

[ JPL ]

Here’s a sequence of videos of Cassie Blue making it (or mostly making it) up a 22-degree slope.

My mood these days is Cassie at 1:09.

[ University of Michigan ]

Thanks Jesse!

This is somewhere on the line between home automation and robotics, but it’s a cool idea: A baby crib that “uses computer vision and machine learning to recognize subtle changes” in an infant’s movement, and proactively bounces them to keep them sleeping peacefully.

It costs $1000, but how much value do you put on 24 months of your own sleep?
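Cradlewise hasn't published how its model works, so treat the following as a generic computer-vision sketch of the idea rather than their algorithm: estimate motion by frame differencing and trigger a bounce when movement stays above a threshold for several frames. The threshold, frame count, and trigger_bounce hook are all made-up placeholders.

```python
# Illustrative sketch only (Cradlewise's actual model is ML-based and proprietary):
# estimate infant motion by frame differencing and trigger a bounce when movement
# exceeds a threshold for several consecutive frames.
import cv2
import numpy as np

MOTION_THRESHOLD = 4.0      # mean absolute pixel difference (assumed tuning value)
CONSECUTIVE_FRAMES = 15     # ~0.5 s of sustained motion at 30 fps

def monitor_crib(trigger_bounce):
    cap = cv2.VideoCapture(0)
    prev = None
    active = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        if prev is not None:
            score = np.mean(cv2.absdiff(gray, prev))
            active = active + 1 if score > MOTION_THRESHOLD else 0
            if active >= CONSECUTIVE_FRAMES:
                trigger_bounce()   # hypothetical call into the crib's bounce motor
                active = 0
        prev = gray
    cap.release()
```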

[ Cradlewise ]

Thanks Ben!

As captive marine mammal shows have fallen from favor, and the catching, transporting, and breeding of marine animals has become more restricted, the marine park industry has become more challenging as a viable business – yet the audience appetite for this type of entertainment and education has remained constant.

Real-time Animatronics provide a way to reinvent the marine entertainment industry with a sustainable, safe, and profitable future. Show venues include aquariums, marine parks, theme parks, fountain shows, cruise lines, resort hotels, shopping malls, museums, and more.

[ EdgeFX ] via [ Gizmodo ]

Robotic cabling is surprisingly complex and kinda cool to watch.

The video shows the sophisticated robot application “Automatic control cabinet cabling”, which Fraunhofer IPA implemented together with the company Rittal. The software pitasc, developed at Fraunhofer IPA, is used for force-controlled assembly processes. Two UR robot arms carry out the task together. The modular pitasc system enables the robot arms to move and rotate in parallel. They work hand in hand, with one robot holding the cable and the second bringing it to the starting position for the cabling. The robots can find, tighten, hold ready, lay, plug in, fix, move freely or immerse cables. They can also perform push-ins and pull tests.

[ Fraunhofer ]

This is from 2018, but the concept is still pretty neat.

We propose to perform a novel investigation into the ability of a propulsively hopping robot to reach targets of high science value on the icy, rugged terrains of Ocean Worlds. The employment of a multi-hop architecture allows for the rapid traverse of great distances, enabling a single mission to reach multiple geologic units within a timespan conducive to system survival in a harsh radiation environment. We further propose that the use of a propulsive hopping technique obviates the need for terrain topographic and strength assumptions and allows for complete terrain agnosticism, a key strength of this concept.

[ NASA ]

Aerial-aquatic robots possess the unique ability of operating in both air and water. However, this capability comes with tremendous challenges, such as communication incompatibility, increased airborne mass, potentially inefficient operation in each of the environments and manufacturing difficulties. Such robots, therefore, typically have small payloads and a limited operational envelope, often making their field usage impractical. We propose a novel robotic water sampling approach that combines the robust technologies of multirotors and underwater micro-vehicles into a single integrated tool usable for field operations.

[ Imperial ]

Event cameras are bio-inspired vision sensors with microsecond latency, a much larger dynamic range, and a hundred times lower power consumption than standard cameras. This 20-minute talk gives a short tutorial on event cameras and shows their applications in computer vision, drones, and cars.
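For readers who haven't worked with event cameras before, the output isn't frames at all; it's a stream of per-pixel brightness-change events. A minimal sketch (generic, not from the talk) of turning a short window of events into a viewable image:

```python
# Sketch: event cameras emit asynchronous (x, y, timestamp, polarity) tuples rather
# than frames. A common first step is to accumulate a short time window into an
# image for visualization or as input to a frame-based algorithm.
import numpy as np

def events_to_frame(events, width, height, t_start, window_s=0.01):
    """events: iterable of (x, y, t, polarity) with polarity in {+1, -1}."""
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, p in events:
        if t_start <= t < t_start + window_s:
            frame[y, x] += p          # ON events brighten, OFF events darken
    return frame

# Example with synthetic events: two ON events and one OFF event fall in the window.
demo = [(10, 5, 0.001, +1), (10, 5, 0.004, +1), (20, 7, 0.006, -1), (3, 3, 0.050, +1)]
print(events_to_frame(demo, width=32, height=16, t_start=0.0)[5, 10])  # -> 2
```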

[ UZH ]

We interviewed Paul Newman, Perla Maiolino and Lars Kunze, ORI academics, to hear what gets them excited about robots in the future and any advice they have for those interested in the field.

[ Oxford Robotics Institute ]

Two projects from the Rehabilitation Engineering Lab at ETH Zurich, including a self-stabilizing wheelchair and a soft exoskeleton for grasping assistance.

[ ETH Zurich ]

Silicon Valley Robotics hosted an online conversation about robotics and racism. Moderated by Andra Keay, the panel featured Maynard Holliday, Tom Williams, Monroe Kennedy III, Jasmine Lawrence, Chad Jenkins, and Ken Goldberg.

[ SVR ]

The ICRA Legged Locomotion workshop has been taking place online, and while we’re not getting a robot mosh pit, there are still some great talks. We’ll post two here, but for more, follow the legged robots YouTube channel at the link below.

[ YouTube ]


#437709 iRobot Announces Major Software Update, ...

Since the release of the very first Roomba in 2002, iRobot’s long-term goal has been to deliver cleaner floors in a way that’s effortless and invisible. Which sounds pretty great, right? And arguably, iRobot has managed to do exactly this, with its most recent generation of robot vacuums that make their own maps and empty their own dustbins. For those of us who trust our robots, this is awesome, but iRobot has gradually been realizing that many Roomba users either don’t want this level of autonomy, or aren’t ready for it.

Today, iRobot is announcing a major new update to its app that represents a significant shift of its overall approach to home robot autonomy. Humans are being brought back into the loop through software that tries to learn when, where, and how you clean so that your Roomba can adapt itself to your life rather than the other way around.

To understand why this is such a shift for iRobot, let’s take a very brief look back at how the Roomba interface has evolved over the last couple of decades. The first generation of Roomba had three buttons on it that allowed (or required) the user to select whether the room being vacuumed was small, medium, or large. iRobot ditched that system one generation later, replacing the room size buttons with one single “clean” button. Programmable scheduling meant that users no longer needed to push any buttons at all, and with Roombas able to find their way back to their docking stations, all you needed to do was empty the dustbin. And with the most recent few generations (the S and i series), the dustbin emptying is also done for you, reducing direct interaction with the robot to once a month or less.

Image: iRobot


The point that the top-end Roombas are at now reflects a goal that iRobot has been working toward since 2002: With autonomy, scheduling, and the clean base to empty the bin, you can set up your Roomba to vacuum when you’re not home, giving you cleaner floors every single day without you even being aware that the Roomba is hard at work while you’re out. It’s not just hands-off, it’s brain-off. No noise, no fuss, just things being cleaner thanks to the efforts of a robot that does its best to be invisible to you. Personally, I’ve been completely sold on this idea for home robots, and iRobot CEO Colin Angle was as well.

“I probably told you that the perfect Roomba is the Roomba that you never see, you never touch, you just come home everyday and it’s done the right thing,” Angle told us. “But customers don’t want that—they want to be able to control what the robot does. We started to hear this a couple years ago, and it took a while before it sunk in, but it made sense.”

How? Angle compares it to having a human come into your house to clean, but you weren’t allowed to tell them where or when to do their job. Maybe after a while, you’ll build up the amount of trust necessary for that to work, but in the short term, it would likely be frustrating. And people get frustrated with their Roombas for this reason. “The desire to have more control over what the robot does kept coming up, and for me, it required a pretty big shift in my view of what intelligence we were trying to build. Autonomy is not intelligence. We need to do something more.”

That something more, Angle says, is a partnership as opposed to autonomy. It’s an acknowledgement that not everyone has the same level of trust in robots as the people who build them. It’s an understanding that people want to feel in control of their homes, which they have set up the way they want and have been cleaning the way they want, and that a robot shouldn’t just come in and do its own thing.


“Until the robot proves that it knows enough about your home and about the way that you want your home cleaned,” Angle says, “you can’t move forward.” He adds that this is one of those things that seem obvious in retrospect, but even if they’d wanted to address the issue before, they didn’t have the technology to solve the problem. Now they do. “This whole journey has been earning the right to take this next step, because a robot can’t be responsive if it’s incompetent,” Angle says. “But thinking that autonomy was the destination was where I was just completely wrong.”

The previous iteration of the iRobot app (and the Roombas themselves) was built around one big fat CLEAN button. The new approach instead tries to figure out in much more detail where the robot should clean, and when, using a mixture of autonomous technology and interaction with the user.

Where to Clean
Knowing where to clean depends on your Roomba having a detailed and accurate map of its environment. For several generations now, Roombas have been using visual simultaneous localization and mapping (VSLAM) to build persistent maps of your home. These maps have been used to tell the Roomba to clean in specific rooms, but that’s about it. With the new update, Roombas with cameras will be able to recognize some objects and features in your home, including chairs, tables, couches, and even countertops. The robots will use these features to identify where messes tend to happen so that they can focus on those areas—like around the dining room table or along the front of the couch.

We should take a minute here to clarify how the Roomba is using its camera. The original (primary?) purpose of the camera was for VSLAM, where the robot would take photos of your home, downsample them into QR-code-like patterns of light and dark, and then use those (with the assistance of other sensors) to navigate. Now the camera is also being used to take pictures of other stuff around your house to make that map more useful.
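iRobot hasn't published the details of that pipeline, but the downsampling idea is straightforward to illustrate. Here's a hypothetical sketch of reducing a frame to a coarse light/dark grid that can serve as a place fingerprint; the grid size and matching rule are assumptions, not iRobot's implementation.

```python
# Sketch of the general idea only (iRobot's actual VSLAM pipeline is not public):
# reduce a camera frame to a tiny binary light/dark grid and compare grids with a
# simple match score, so no recognizable imagery needs to be kept.
import numpy as np

def fingerprint(gray_image, grid=16):
    """Downsample a grayscale image to a grid x grid binary pattern."""
    h, w = gray_image.shape
    small = gray_image[: h - h % grid, : w - w % grid]
    small = small.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    return (small > small.mean()).astype(np.uint8)   # 1 = brighter than average

def similarity(fp_a, fp_b):
    """Fraction of matching cells between two fingerprints (1.0 = identical)."""
    return float(np.mean(fp_a == fp_b))
```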

Photo: iRobot


This is done through machine learning using a library of images of common household objects from a floor perspective that iRobot had to develop from scratch. Angle clarified for us that this is all done via a neural net that runs on the robot, and that “no recognizable images are ever stored on the robot or kept, and no images ever leave the robot.” Worst case, if all the data iRobot has about your home gets somehow stolen, the hacker would only know that (for example) your dining room has a table in it and the approximate size and location of that table, because the map iRobot has of your place only stores symbolic representations rather than images.
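iRobot's internal map format isn't public, but a symbolic representation along these lines is easy to imagine: each detected object is reduced to a label, a room, and an approximate footprint, with no imagery retained. The field names below are purely illustrative.

```python
# Sketch of what a "symbolic" map entry might look like (iRobot's internal format is
# not public): the detector's output is reduced to a label, an approximate footprint,
# and a room assignment -- no pixels are retained.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MapObject:
    label: str            # e.g. "dining_table", "couch"
    room: str             # e.g. "dining_room"
    x_m: float            # approximate center in the map frame, meters
    y_m: float
    width_m: float        # approximate footprint
    depth_m: float

@dataclass
class SymbolicMap:
    objects: List[MapObject] = field(default_factory=list)

    def messy_zones(self):
        """Suggest focus-clean zones around furniture where debris tends to collect."""
        return [(o.room, o.label, o.x_m, o.y_m) for o in self.objects
                if o.label in {"dining_table", "couch", "kitchen_counter"}]

home = SymbolicMap([MapObject("dining_table", "dining_room", 3.2, 1.5, 1.8, 0.9)])
print(home.messy_zones())   # [('dining_room', 'dining_table', 3.2, 1.5)]
```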

Another useful new feature is intended to help manage the “evil Roomba places” (as Angle puts it) that every home has, the spots that cause Roombas to get stuck. If a place is evil enough that Roomba has to call you for help because it gave up completely, Roomba will now remember it and suggest that you either make some changes or let it stop cleaning there, which seems reasonable.

When to Clean
It turns out that the primary cause of mission failure for Roombas is not that they get stuck or that they run out of battery—it’s user cancellation, usually because the robot is getting in the way or being noisy when you don’t want it to be. “If you kill a Roomba’s job because it annoys you,” points out Angle, “how is that robot being a good partner? I think it’s an epic fail.” Of course, it’s not the robot’s fault, because Roombas only clean when we tell them to, which Angle says is part of the problem. “People actually aren’t very good at making their own schedules—they tend to oversimplify, and not think through what their schedules are actually about, which leads to lots of [figurative] Roomba death.”

To help you figure out when the robot should actually be cleaning, the new app will look for patterns in when you ask the robot to clean, and then recommend a schedule based on those patterns. That might mean the robot cleans different areas at different times every day of the week. The app will also make event-based scheduling recommendations, integrated with other smart home devices. Would you prefer the Roomba to clean every time you leave the house? The app can integrate with your security system (or garage door, or any number of other things) and take care of that for you.
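The recommender itself is iRobot's, but the basic pattern-mining idea can be sketched in a few lines: tally when manual cleans were requested and propose a recurring slot for any day-and-hour combination that keeps showing up. The threshold below is an arbitrary placeholder, not iRobot's logic.

```python
# Sketch of the pattern-to-schedule idea (not iRobot's actual recommender): count when
# manual cleans were requested and propose a recurring slot for any (weekday, hour)
# pair that shows up often enough.
from collections import Counter
from datetime import datetime
from typing import List, Tuple

def recommend_schedule(clean_requests: List[datetime],
                       min_occurrences: int = 3) -> List[Tuple[str, int]]:
    weekdays = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
    counts = Counter((t.weekday(), t.hour) for t in clean_requests)
    return [(weekdays[d], h) for (d, h), n in sorted(counts.items())
            if n >= min_occurrences]

history = [datetime(2020, 8, d, 19, 15) for d in (3, 10, 17, 24)]   # Mondays, 7 pm
print(recommend_schedule(history))   # [('Mon', 19)]
```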

More generally, Roomba will now try to fit into the kinds of cleaning routines that many people already have established. For example, the app may suggest an “after dinner” routine that cleans just around the kitchen and dining room table. The app will also, to some extent, pay attention to the environment and season. It might suggest increasing your vacuuming frequency if pollen counts are especially high, or if it’s pet shedding season and you have a dog. Unfortunately, Roomba isn’t (yet?) capable of recognizing dogs on its own, so the app has to cheat a little bit by asking you some basic questions.

A Smarter App

Image: iRobot


The app update, which should be available starting today, is free. The scheduling and recommendations will work on every Roomba model, although for object recognition and anything related to mapping, you’ll need one of the more recent and fancier models with a camera. Future app updates will happen on a more aggressive schedule. Major app releases should happen every six months, with incremental updates happening even more frequently than that.

Angle also told us that overall, this change in direction also represents a substantial shift in resources for iRobot, and the company has pivoted two-thirds of its engineering organization to focus on software-based collaborative intelligence rather than hardware. “It’s not like we’re done doing hardware,” Angle assured us. “But we do think about hardware differently. We view our robots as platforms that have longer life cycles, and each platform will be able to support multiple generations of software. We’ve kind of decoupled robot intelligence from hardware, and that’s a change.”

Angle believes that working toward more intelligent collaboration between humans and robots is “the brave new frontier of artificial intelligence. I expect it to be the frontier for a reasonable amount of time to come,” he adds. “We have a lot of work to do to create the type of easy-to-use experience that consumer robots need.”


#437707 Video Friday: This Robot Will Restock ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Tokyo startup Telexistence has recently unveiled a new robot called the Model-T, an advanced teleoperated humanoid that can use tools and grasp a wide range of objects. Japanese convenience store chain FamilyMart plans to test the Model-T to restock shelves in up to 20 stores by 2022. In the trial, a human “pilot” will operate the robot remotely, handling items like beverage bottles, rice balls, sandwiches, and bento boxes.

With the Model-T and AWP, FamilyMart and TX aim to realize a completely new store operation by making the merchandise restocking work, which requires a large number of labor-hours, remote and automated. As a result, stores can operate with fewer workers and recruit employees regardless of the store’s physical location.

[ Telexistence ]

Quadruped dance-off should be a new robotics competition at IROS or ICRA.

I dunno though, that moonwalk might keep Spot in the lead…

[ Unitree ]

Through a hybrid of simulation and real-life training, this air muscle robot is learning to play table tennis.

Table tennis requires executing fast and precise motions. To gain precision it is necessary to explore in these high-speed regimes; however, exploration can be safety-critical at the same time. The combination of RL and muscular soft robots makes it possible to close this gap. While robots actuated by pneumatic artificial muscles generate the high forces required for, e.g., smashing, they also offer safe execution of explosive motions due to antagonistic actuation.

To enable practical training without real balls, we introduce Hybrid Sim and Real Training (HYSR), which replays prerecorded real balls in simulation while executing actions on the real system. In this manner, RL can learn the challenging motor control of the PAM-driven robot while executing ~15,000 hitting motions.
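The paper has the real details, but the hybrid loop it describes can be sketched roughly as follows; the policy, real_arm, and sim objects here are hypothetical placeholders standing in for the actual PAM robot interface and ball simulator, not the Max Planck implementation.

```python
# Rough sketch of the Hybrid Sim and Real (HYSR) idea described above: prerecorded
# ball trajectories are replayed in simulation while the policy's actions are sent to
# the real PAM-driven arm. The policy/real_arm/sim interfaces are placeholders.
import random

def hysr_episode(policy, real_arm, sim, recorded_ball_trajectories, dt=0.002):
    ball = random.choice(recorded_ball_trajectories)   # list of (t, x, y, z) samples
    sim.reset(ball_trajectory=ball)                     # the ball exists only in sim
    obs = combine(real_arm.read_joint_state(), sim.ball_state())
    total_reward = 0.0
    while not sim.done():
        action = policy.act(obs)            # desired muscle pressures
        real_arm.apply(action)              # executed on the physical robot
        sim.step(mirror_joints=real_arm.read_joint_state(), dt=dt)
        obs = combine(real_arm.read_joint_state(), sim.ball_state())
        total_reward += sim.hitting_reward()
    return total_reward

def combine(joint_state, ball_state):
    return {"joints": joint_state, "ball": ball_state}
```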

[ Max Planck Institute ]

Thanks Dieter!

Anthony Cowley wrote in to share his recent thesis work on UPSLAM, a fast and lightweight SLAM technique that records data in panoramic depth images (just PNGs) that are easy to visualize and even easier to share between robots, even on low-bandwidth networks.
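The "just PNGs" part is worth dwelling on: a depth panorama quantized to 16 bits fits in an ordinary image file that any viewer can open and any network can move cheaply. A rough sketch of that storage idea (not Anthony's actual UPSLAM code, and the 30 m range is an assumption):

```python
# Sketch of the storage idea (not the actual UPSLAM code): quantize a panoramic depth
# map to 16-bit integers and save it as an ordinary PNG, which any image tool can view
# and which is cheap to send between robots.
import cv2
import numpy as np

MAX_DEPTH_M = 30.0    # assumed sensor range; depths are scaled into 0..65535

def save_depth_panorama(depth_m: np.ndarray, path: str):
    scaled = np.clip(depth_m / MAX_DEPTH_M, 0.0, 1.0) * 65535.0
    cv2.imwrite(path, scaled.astype(np.uint16))    # 16-bit grayscale PNG

def load_depth_panorama(path: str) -> np.ndarray:
    raw = cv2.imread(path, cv2.IMREAD_UNCHANGED).astype(np.float32)
    return raw / 65535.0 * MAX_DEPTH_M

# Round-trip a fake panorama (rows = elevation, columns = azimuth).
pano = np.random.uniform(0.5, 20.0, size=(64, 512)).astype(np.float32)
save_depth_panorama(pano, "pano_depth.png")
print(np.abs(load_depth_panorama("pano_depth.png") - pano).max())  # tiny quantization error
```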

[ UPenn ]

Thanks Anthony!

GITAI's G1 is a general-purpose robot dedicated to space. The G1 will enable the automation of various tasks internally and externally on space stations and for lunar base development.

[ GITAI ]

The University of Michigan has a fancy new treadmill that’s built right into the floor, which proves to be a bit much for Mini Cheetah.

But Cassie Blue won’t get stuck on no treadmill! She goes for a 0.3-mile walk across campus, which ends when a certain someone runs the gantry into Cassie Blue’s foot.

[ Michigan Robotics ]

Some serious quadruped research going on at UT Austin Human Centered Robotics Lab.

[ HCRL ]

Will Burrard-Lucas has spent lockdown upgrading his slightly indestructible BeetleCam wildlife photographing robot.

[ Will Burrard-Lucas ]

Teleoperated surgical robots are becoming commonplace in operating rooms, but many are massive (sometimes taking up an entire room) and are difficult to manipulate, especially if a complication arises and the robot needs to be removed from the patient. A new collaboration between the Wyss Institute, Harvard University, and Sony Corporation has created the mini-RCM, a surgical robot the size of a tennis ball that weighs as much as a penny and performed significantly better than manually operated tools in delicate mock-surgical procedures. Importantly, its small size means it is more comparable to the human tissues and structures on which it operates, and it can easily be removed by hand if needed.

[ Harvard Wyss ]

Yaskawa appears to be working on a robot that can scan you with a temperature gun and then jam a mask on your face?

[ Motoman ]

Maybe we should just not have people working in mines anymore, how about that?

[ Exyn ]

Many current human-robot interactive systems tend to use accurate and fast – but also costly – actuators and tracking systems to establish working prototypes that are safe to use and deploy for user studies. This paper presents an embedded framework to build a desktop space for human-robot interaction, using an open-source robot arm as well as two RGB cameras connected to a Raspberry Pi-based controller, which allows fast yet low-cost object tracking and manipulation in 3D. We show in our evaluations that this facilitates prototyping a number of systems in which the user and the robot arm can jointly interact with physical objects.
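The paper's code isn't shown here, but the core two-camera 3D step is standard: with calibrated projection matrices for both RGB cameras, the tracked object's pixel locations can be triangulated into a 3D point with OpenCV. A minimal sketch under toy calibration assumptions:

```python
# Sketch of the two-camera 3D tracking step (not the paper's code): given calibrated
# projection matrices P1 and P2 and the object's pixel location in each view,
# cv2.triangulatePoints recovers its 3D position.
import cv2
import numpy as np

def locate_object(P1, P2, pixel_cam1, pixel_cam2):
    pts1 = np.array(pixel_cam1, dtype=np.float64).reshape(2, 1)
    pts2 = np.array(pixel_cam2, dtype=np.float64).reshape(2, 1)
    point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # 4x1 homogeneous
    return (point_h[:3] / point_h[3]).ravel()             # (x, y, z) in world frame

# Toy setup: two cameras with identity intrinsics, the second shifted 0.1 m along x.
K = np.eye(3)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
true_point = np.array([0.05, 0.02, 0.5])
px1 = true_point[:2] / true_point[2]
px2 = (true_point[:2] + np.array([-0.1, 0.0])) / true_point[2]
print(locate_object(P1, P2, px1, px2))   # ~ [0.05, 0.02, 0.5]
```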

[ Paper ]

IBM Research is proud to host professor Yoshua Bengio — one of the world’s leading experts in AI — in a discussion of how AI can contribute to the fight against COVID-19.

[ IBM Research ]

Ira Pastor, ideaXme life sciences ambassador, interviews Professor Dr. Hiroshi Ishiguro, Director of the Intelligent Robotics Laboratory in the Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, Japan.

[ ideaXme ]

A CVPR talk from Stanford’s Chelsea Finn on “Generalization in Visuomotor Learning.”

[ Stanford ]
