#437758 Remotely Operated Robot Takes Straight ...
Roboticists love hard problems. Challenges like the DRC and SubT have helped (and are still helping) to catalyze major advances in robotics, but not all hard problems require a massive amount of DARPA funding—sometimes, a hard problem can just be something very specific that’s really hard for a robot to do, especially relative to the ease with which a moderately trained human might be able to do it. Catching a ball. Putting a peg in a hole. Or using a straight razor to shave someone’s face without Sweeney Todd-izing them.
This particular roboticist who sees straight-razor face shaving as a hard problem that robots should be solving is John Peter Whitney, who we first met back at IROS 2014 in Chicago when (working at Disney Research) he introduced an elegant fluidic actuator system. These actuators use tubes containing a fluid (like air or water) to transmit forces from a primary robot to a secondary robot in a very efficient way that also allows for either compliance or very high fidelity force feedback, depending on the compressibility of the fluid.
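To get an intuition for why the choice of fluid matters, it helps to think of the trapped fluid column as a spring connecting the two robots: a roughly incompressible fluid like water makes that spring very stiff, so forces pass through with high fidelity, while air makes it soft and compliant. The short Python sketch below is only a back-of-the-envelope model of that idea, not Whitney’s actual hardware; the piston area, line volume, and bulk-modulus values are illustrative assumptions, and the stiffness formula k = βA²/V is the standard expression for a fluid column acting on a piston.

```python
# Simplified spring model of a fluid transmission line: the trapped fluid
# column acts like a spring between the primary and secondary pistons.
# Effective stiffness of a fluid column: k = beta * A**2 / V, where beta is
# the fluid's bulk modulus, A the piston area, and V the trapped volume.
# All numbers below are illustrative, not taken from Whitney's hardware.

def transmission_stiffness(bulk_modulus_pa, piston_area_m2, line_volume_m3):
    """Return the equivalent linear stiffness (N/m) of the fluid column."""
    return bulk_modulus_pa * piston_area_m2**2 / line_volume_m3

piston_area = 1e-4   # 1 cm^2 piston face (assumed)
line_volume = 5e-6   # 5 mL of fluid in the tube (assumed)

water = transmission_stiffness(2.2e9, piston_area, line_volume)  # nearly incompressible
air = transmission_stiffness(1.0e5, piston_area, line_volume)    # isothermal air at ~1 atm

print(f"water-filled line: {water:.2e} N/m  (stiff -> high-fidelity force feedback)")
print(f"air-filled line:   {air:.2e} N/m  (soft  -> compliant behavior)")
```

In this toy example, swapping the fluid changes the effective stiffness by about four orders of magnitude, which is the basic intuition behind trading compliance against force fidelity.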
Photo: John Peter Whitney/Northeastern University
Barber meets robot: Boston-based barber Jesse Cabbage [top, right] observes the machine created by roboticist John Peter Whitney. Before testing the robot on Whitney’s face, they used his arm for a quick practice run [bottom].
Whitney is now at Northeastern University, in Boston, and he recently gave a talk at the RSS workshop on “Reacting to Contact,” where he suggested that straight razor shaving would be an interesting and valuable problem for robotics to work toward, due to its difficulty and requirement for an extremely high level of both performance and reliability.
Now, a straight razor is sort of like a safety razor, except with the safety part removed, which in fact does make it significantly less safe for humans, much less robots. Also not ideal for those worried about safety is that as part of the process the razor ends up in distressingly close proximity to things like the artery that is busily delivering your brain’s entire supply of blood, which is very close to the top of the list of things that most people want to keep blades very far away from. But that didn’t stop Whitney from putting his whiskers where his mouth is and letting his robotic system mediate the ministrations of a professional barber. It’s not an autonomous robotic straight-razor shave (because Whitney is not totally crazy), but it’s a step in that direction, and requires that the hardware Whitney developed be dead reliable.
Perhaps that was a poor choice of words. But, rest assured that Whitney lived long enough to answer our questions after. Here’s the video; it’s part of a longer talk, but it should start in the right spot, at about 23:30.
If Whitney looked a little bit nervous to you, that’s because he was. “This was the first time I’d ever been shaved by someone (something?!) else with a straight razor,” he told us, and while having a professional barber at the helm was some comfort, “the lack of feeling and control on my part was somewhat unsettling.” Whitney says that the barber, Jesse Cabbage of Dentes Barbershop in Somerville, Mass., was surprised by how well he could feel the tactile sensations being transmitted from the razor. “That’s one of the reasons we decided to make this video,” Whitney says. “I can’t show someone how something feels, so the next best thing is to show a delicate task that either from experience or intuition makes it clear to the viewer that the system must have these properties—otherwise the task wouldn’t be possible.”
And as for when Whitney might be comfortable getting shaved by a robotic system without a human in the loop? It’s going to take a lot of work, like most other hard problems in robotics. “There are two parts to this,” he explains. “One is fault-tolerance of the components themselves (software, electronics, etc.) and the second is the quality of the perception and planning algorithms.”
He offers a comparison to self-driving cars, in which similar (or greater) risks are incurred: “To learn how to perceive, interpret, and adapt, we need a very high-fidelity model of the problem, or a wealth of data and experience, or both,” he says. “But in the case of shaving we are greatly lacking in both!” He continues with the analogy: “I think there is a natural progression—the community started with autonomous driving of toy cars on closed courses and worked up to real cars carrying human passengers; in robotic manipulation we are beginning to move out of the ‘toy car’ stage and so I think it’s good to target high-consequence hard problems to help drive progress.”
Of course, the ultimate goal here is much more general than the creation of a dedicated straight razor shaving robot; it’s a challenge that includes a host of sub-goals that will benefit robotics more generally. This particular hardware system Whitney is developing is actually a testbed for exploring MRI-compatible remote needle biopsy, and he and his students are collaborating with Brigham and Women’s Hospital in Boston on adapting this technology to prostate biopsy and ablation procedures. They’re also exploring how delicate touch can be used as a way to map an environment and localize within it, especially where using vision may not be a good option. “These traits and behaviors are especially interesting for applications where we must interact with delicate and uncertain environments,” says Whitney. “Medical robots, assistive and rehabilitation robots and exoskeletons, and shared-autonomy teleoperation for delicate tasks.”
A paper with more details on this robotic system, “Series Elastic Force Control for Soft Robotic Fluid Actuators,” is available on arXiv.
#437753 iRobot’s New Education Robot Makes ...
iRobot has been on a major push into education robots recently. They acquired Root Robotics in 2019, and earlier this year, launched an online simulator and associated curriculum designed to work in tandem with physical Root robots. The original Root was intended to be a classroom robot, with one of its key features being the ability to stick to (and operate on) magnetic virtual surfaces, like whiteboards. And as a classroom robot, at $200, it’s relatively affordable, if you can buy one or two and have groups of kids share them.
For kids who are more focused on learning at home, though, $200 is a lot for a robot that doesn't even keep your floors clean. And as nice as it is to have a free simulator, any kid will tell you that it’s way cooler to have a real robot to mess around with. Today, iRobot is announcing a new version of Root that’s been redesigned for home use, with a $129 price that makes it significantly more accessible to folks outside of the classroom.
The Root rt0 is a second version of the Root robot—the more expensive, education-grade Root rt1 is still available. To bring the cost down, the rt0 is missing some features that you can still find in the rt1. Specifically, you don’t get the internal magnets to stick the robot to vertical surfaces, there are no cliff sensors, and you don’t get a color scanner or an eraser. But for home use, the internal magnets are probably not necessary anyway, and the rest of that stuff seems like a fair compromise for a cost reduction of about 35 percent.
Photo: iRobot
One of the new accessories for the iRobot Root rt0 is a “Brick Top” that snaps onto the upper face of the robot via magnets. The accessory can be used with LEGOs and other LEGO-compatible bricks, opening up an enormous amount of customization.
It’s not all just taking away, though. There’s also a new $20 accessory, a LEGO-ish “Brick Top” that snaps onto the upper face of Root (either version) via magnets. The plate can be used with LEGO bricks and other LEGO-compatible things. This opens up an enormous amount of customization, and it’s for more than just decoration, since Root rt0 has the ability to interact with whatever’s on top of it via its actuated marker. Root can move the marker up and down, the idea being that you can programmatically turn lines on and off. By replacing the marker with a plastic thingy that sticks up through the body of the robot, the marker up/down command can be used to actuate something on the brick top. In the video, that’s what triggers the catapult.
Photo: iRobot
By attaching a marker, you can program Root to draw. The robot has a motor that can move the marker up and down.
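To make the repurposed marker command concrete, here’s a minimal, illustrative Python sketch of the idea. The `Root` class and its method names below are hypothetical placeholders rather than the real iRobot Coding or Python SDK interface, and which direction of marker travel actually fires the catapult is an assumption.

```python
# Illustrative-only sketch of how the marker up/down command could double as
# a trigger for a Brick Top accessory (e.g., releasing a catapult). The Root
# class and method names are hypothetical placeholders, not the actual
# iRobot Coding / Python SDK interface.
import time

class Root:
    def drive_forward(self, distance_cm: float) -> None:
        print(f"driving forward {distance_cm} cm")

    def marker_up(self) -> None:
        # With a push-rod fitted in the marker slot, raising the "marker"
        # presses up into whatever is mounted on the Brick Top (assumed).
        print("marker/actuator up")

    def marker_down(self) -> None:
        # With a plain marker this would lower the pen to draw a line.
        print("marker/actuator down")

robot = Root()
robot.drive_forward(30)   # get into position
robot.marker_up()         # push the rod up through the body: fire the mechanism
time.sleep(1.0)
robot.marker_down()       # retract so the accessory can be reset
```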
This less expensive version of Root still has access to the online simulator, as well as the multi-level coding interface that allows kids to seamlessly transition through multiple levels of coding complexity, from graphical to text. There’s a new Android app coming out today, and you can access everything through web-based apps on Chrome OS, Windows, and macOS, as well as on iOS. iRobot tells us that they’ve also recently expanded their online learning library full of Root-based educational activities. In particular, they’ve added a new category on “Social Emotional Learning,” the goal of which is to help kids develop things like social awareness, self-management, decision making, and relationship skills. We’re not quite sure how you teach those things with a little hexagonal robot, but we like that iRobot is giving it a try.
The Root coding robot is designed for kids ages 6 and up, ships for free, and is available now.
[ iRobot Root ]
#437741 CaseCrawler Adds Tiny Robotic Legs to ...
Most of us have a fairly rational expectation that if we put our cellphone down somewhere, it will stay in that place until we pick it up again. Normally, this is exactly what you’d want, but there are exceptions, like when you put your phone down in not quite the right spot on a wireless charging pad without noticing, or when you’re lying on the couch and your phone is juuust out of reach no matter how much you stretch.
Roboticists from the Biorobotics Laboratory at Seoul National University in South Korea have solved both of these problems, and many more besides, by developing a cellphone case with little robotic legs, endowing your phone with the ability to skitter around autonomously. And unlike most of the phone-robot hybrids we’ve seen in the past, this one actually does look like a legit case for your phone.
CaseCrawler is much chunkier than a form-fitting case, but it’s not offensively bigger than one of those chunky battery cases. It’s only 24 millimeters thick (excluding the motor housing), and the total weight is just under 82 grams. Keep in mind that this case is in fact an entire robot, and also not at all optimized for being an actual phone case, so it’s easy to imagine how it could get a lot more svelte—for example, it currently includes a small battery that would be unnecessary if it instead tapped into the phone for power.
The technology inside is pretty amazing, since it involves legs that can retract all the way flat while also supporting a significant amount of weight. The legs work sort of like your legs do, in that there’s a knee joint that can only bend one way. To move the robot forward, a linkage (attached to a motor through a gearbox) pushes the leg back against the ground, as the knee joint keeps the leg straight. On the return stroke, the joint allows the leg to fold, making it compliant so that it doesn’t exert force on the ground. The transmission that sends power from the gearbox to the legs is just 1.5 millimeters thick, but this incredibly thin and lightweight mechanical structure is quite powerful. A non-phone case version of the robot, weighing about 23 g, is able to crawl at 21 centimeters per second while carrying a payload of just over 300 g. That’s more than 13 times its body weight.
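As a rough illustration of that one-way knee, the toy Python model below (not taken from the CaseCrawler paper, and with made-up numbers) treats the joint as a mechanical diode: it passes the linkage force to the ground on the power stroke and transmits essentially nothing on the return stroke.

```python
# Toy model of the one-way knee described above (not from the CaseCrawler
# paper): when the linkage pushes the leg backward (power stroke) the straight
# knee transmits force to the ground; when it swings the leg forward (return
# stroke) the knee folds, so the leg drags compliantly and pushes on nothing.

def ground_force(linkage_force_n: float, power_stroke: bool) -> float:
    """Horizontal force the foot applies to the ground, in newtons."""
    if power_stroke and linkage_force_n > 0:
        return linkage_force_n   # knee locked straight: force reaches the ground
    return 0.0                   # knee folds: return stroke is essentially free

# One full stride: half power stroke, half return stroke (illustrative values).
stride = [(2.0, True), (2.0, False)]
forces = [ground_force(f, p) for f, p in stride]
print(forces)  # [2.0, 0.0] -> net forward propulsion once per cycle
```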
The researchers plan on exploring how robots like these could make other objects movable that would otherwise not be. They’d also like to add some autonomy, which (at least for the phone case version) could be as straightforward as leveraging the existing sensors on the phone. And as to when you might be able to buy one of these—we’ll keep you updated, but the good news is that it seems to be fundamentally inexpensive enough that it may actually crawl out of the lab one day.
“CaseCrawler: A Lightweight and Low-Profile Crawling Phone Case Robot,” by Jongeun Lee, Gwang-Pil Jung, Sang-Min Baek, Soo-Hwan Chae, Sojung Yim, Woongbae Kim, and Kyu-Jin Cho from Seoul National University, appears in the October issue of IEEE Robotics and Automation Letters.