#437763 Peer Review of Scholarly Research Gets ...

In the world of academics, peer review is considered the only credible validation of scholarly work. Although the process has its detractors, evaluation of academic research by a cohort of contemporaries has endured for over 350 years, with “relatively minor changes.” However, peer review may be set to undergo its biggest revolution ever—the integration of artificial intelligence.

Open-access publisher Frontiers has debuted an AI tool called the Artificial Intelligence Review Assistant (AIRA), which purports to eliminate much of the grunt work associated with peer review. Since the beginning of June 2020, every one of the 11,000-plus submissions Frontiers received has been run through AIRA, which is integrated into its collaborative peer-review platform. This also makes it accessible to external users, accounting for some 100,000 editors, authors, and reviewers. Altogether, this helps “maximize the efficiency of the publishing process and make peer-review more objective,” says Kamila Markram, founder and CEO of Frontiers.

AIRA’s interactive online platform, a first of its kind in the industry, has been in development for three years. It performs three broad functions, explains Daniel Petrariu, director of project management: assessing the quality of the manuscript, assessing the quality of peer review, and recommending editors and reviewers. At the initial validation stage, the AI can make up to 20 recommendations and flag potential issues, including language quality, plagiarism, integrity of images, conflicts of interest, and so on. “This happens almost instantly and with [high] accuracy, far beyond the rate at which a human could be expected to complete a similar task,” Markram says.
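Frontiers hasn’t published AIRA’s internals, but the validation stage described above is essentially a battery of independent checks, each of which either passes or raises a flag for a human editor. A toy sketch of that pattern (every name and heuristic here is our own invention, not Frontiers’ code) might look like this:

```python
# Hypothetical sketch of an AIRA-style pre-submission check pipeline.
# Each check inspects a manuscript record and may emit a flag string;
# humans, not the pipeline, act on the flags.

def check_language_quality(ms):
    """Placeholder heuristic: flag suspiciously short abstracts."""
    if len(ms.get("abstract", "").split()) < 50:
        return "language: abstract may be too short"
    return None

def check_conflicts(ms):
    """Flag reviewers who share an affiliation with any author."""
    author_affils = {a["affiliation"] for a in ms.get("authors", [])}
    for r in ms.get("reviewers", []):
        if r["affiliation"] in author_affils:
            return f"conflict of interest: reviewer {r['name']}"
    return None

def run_checks(ms, checks=(check_language_quality, check_conflicts)):
    """Run every check and collect the flags that fired."""
    return [flag for check in checks if (flag := check(ms)) is not None]
```

In a real system each check would be a trained model rather than a rule, but the flag-and-defer-to-humans structure is the point.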

“We have used a wide variety of machine-learning models for a diverse set of applications, including computer vision, natural language processing, and recommender systems,” says Markram. This includes simple bag-of-words models, as well as more sophisticated deep-learning ones. AIRA also leverages a large knowledge base of publications and authors.
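To make the “simple bag-of-words” end of that spectrum concrete, here is a minimal, self-contained sketch (our illustration only; AIRA’s actual models are not public): texts become word-count vectors, and a toy nearest-centroid rule scores a new passage against examples of acceptable and poor writing.

```python
from collections import Counter

# Toy bag-of-words language-quality scorer: represent each text as a
# word-count vector, then compare a query against the centroid of
# "acceptable" examples and the centroid of "poor" examples.

def bow(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def centroid(texts):
    total = Counter()
    for t in texts:
        total.update(bow(t))
    return total

def classify(text, good_centroid, poor_centroid):
    g = cosine(bow(text), good_centroid)
    p = cosine(bow(text), poor_centroid)
    return "acceptable" if g >= p else "needs language editing"
```

A production system would use far richer features and deep models, as Markram notes, but even this skeleton shows why such checks run “almost instantly.”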

Markram notes that, to address issues of possible AI bias, “We…[build] our own datasets and [design] our own algorithms. We make sure no statistical biases appear in the sampling of training and testing data. For example, when building a model to assess language quality, scientific fields are equally represented so the model isn’t biased toward any specific topic.” Outputs from the machine- and deep-learning models, along with feedback from domain experts (including any errors), are captured and used as additional training data. “By regularly re-training, we make sure our models improve in terms of accuracy and stay up-to-date.”
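The field-balancing step Markram describes is straightforward to sketch. In this illustrative snippet (our own, with an assumed `field` tag on each training example), an equal number of examples is drawn per scientific field so that no topic dominates the training set:

```python
import random

# Field-balanced sampling: group examples by their scientific field,
# then draw the same number from each group, so the resulting training
# set is not skewed toward whichever field submits the most papers.

def balanced_sample(examples, per_field, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducible sampling
    by_field = {}
    for ex in examples:
        by_field.setdefault(ex["field"], []).append(ex)
    sample = []
    for field, items in sorted(by_field.items()):
        if len(items) < per_field:
            raise ValueError(f"not enough examples for field {field!r}")
        sample.extend(rng.sample(items, per_field))
    return sample
```

The same idea applies to the held-out test split, so that accuracy numbers aren’t flattered by an over-represented field.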

The AI’s job is to flag concerns; humans make the final decisions, says Petrariu. As an example, he cites image manipulation detection—something AI is extremely efficient at, and that would be nearly impossible for a human to perform with the same accuracy. “About 10 percent of our flagged images have some sort of problem,” he adds. “[In academic publishing] nobody has done this kind of comprehensive check [using AI] before,” says Petrariu. AIRA, he adds, facilitates Frontiers’ mission to make science open and knowledge accessible to all.

Posted in Human Robots

#437758 Remotely Operated Robot Takes Straight ...

Roboticists love hard problems. Challenges like the DRC and SubT have helped (and are still helping) to catalyze major advances in robotics, but not all hard problems require a massive amount of DARPA funding—sometimes, a hard problem can just be something very specific that’s really hard for a robot to do, especially relative to the ease with which a moderately trained human might be able to do it. Catching a ball. Putting a peg in a hole. Or using a straight razor to shave someone’s face without Sweeney Todd-izing them.

This particular roboticist who sees straight-razor face shaving as a hard problem that robots should be solving is John Peter Whitney, who we first met back at IROS 2014 in Chicago when (working at Disney Research) he introduced an elegant fluidic actuator system. These actuators use tubes containing a fluid (like air or water) to transmit forces from a primary robot to a secondary robot in a very efficient way that also allows for either compliance or very high fidelity force feedback, depending on the compressibility of the fluid.
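The compliance-versus-fidelity tradeoff in those actuators can be put in rough numbers. In a simplified lumped model (our back-of-the-envelope simplification, not Whitney’s published analysis), two pistons joined by a trapped fluid column behave like a spring of stiffness k = β·A²/V, where β is the fluid’s bulk modulus, A the piston area, and V the fluid volume, so a nearly incompressible fluid like water transmits forces stiffly while air adds compliance:

```python
# Lumped hydraulic-stiffness model for a fluid transmission line:
# k = bulk_modulus * piston_area**2 / trapped_fluid_volume.
# A stiff (incompressible) fluid gives high force fidelity;
# a compressible one behaves like a soft series spring.

def hydraulic_stiffness(bulk_modulus_pa, piston_area_m2, fluid_volume_m3):
    return bulk_modulus_pa * piston_area_m2**2 / fluid_volume_m3

# Representative bulk moduli: water ~2.2e9 Pa; air at 1 atm
# (isothermal) ~1.0e5 Pa. Geometry values below are illustrative.
AREA = 1e-4    # 1 cm^2 piston, assumed
VOLUME = 1e-5  # 10 mL of trapped fluid, assumed

k_water = hydraulic_stiffness(2.2e9, AREA, VOLUME)
k_air = hydraulic_stiffness(1.0e5, AREA, VOLUME)
```

With these (assumed) dimensions the water-filled line is over four orders of magnitude stiffer than the air-filled one, which is exactly the knob Whitney’s design turns between compliance and high-fidelity force feedback.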

Photo: John Peter Whitney/Northeastern University

Barber meets robot: Boston-based barber Jesse Cabbage [top, right] observes the machine created by roboticist John Peter Whitney. Before testing the robot on Whitney’s face, they used his arm for a quick practice [bottom].

Whitney is now at Northeastern University, in Boston, and he recently gave a talk at the RSS workshop on “Reacting to Contact,” where he suggested that straight razor shaving would be an interesting and valuable problem for robotics to work toward, due to its difficulty and requirement for an extremely high level of both performance and reliability.

Now, a straight razor is sort of like a safety razor, except with the safety part removed, which in fact does make it significantly less safe for humans, much less robots. Also not ideal for those worried about safety is that as part of the process the razor ends up in distressingly close proximity to things like the artery that is busily delivering your brain’s entire supply of blood, which is very close to the top of the list of things that most people want to keep blades very far away from. But that didn’t stop Whitney from putting his whiskers where his mouth is and letting his robotic system mediate the ministrations of a professional barber. It’s not an autonomous robotic straight-razor shave (because Whitney is not totally crazy), but it’s a step in that direction, and requires that the hardware Whitney developed be dead reliable.

Perhaps that was a poor choice of words. But, rest assured that Whitney lived long enough to answer our questions after. Here’s the video; it’s part of a longer talk, but it should start in the right spot, at about 23:30.

If Whitney looked a little bit nervous to you, that’s because he was. “This was the first time I’d ever been shaved by someone (something?!) else with a straight razor,” he told us, and while having a professional barber at the helm was some comfort, “the lack of feeling and control on my part was somewhat unsettling.” Whitney says that the barber, Jesse Cabbage of Dentes Barbershop in Somerville, Mass., was surprised by how well he could feel the tactile sensations being transmitted from the razor. “That’s one of the reasons we decided to make this video,” Whitney says. “I can’t show someone how something feels, so the next best thing is to show a delicate task that either from experience or intuition makes it clear to the viewer that the system must have these properties—otherwise the task wouldn’t be possible.”

And as for when Whitney might be comfortable getting shaved by a robotic system without a human in the loop? It’s going to take a lot of work, as do most other hard problems in robotics. “There are two parts to this,” he explains. “One is fault-tolerance of the components themselves (software, electronics, etc.) and the second is the quality of the perception and planning algorithms.”

He offers a comparison to self-driving cars, in which similar (or greater) risks are incurred: “To learn how to perceive, interpret, and adapt, we need a very high-fidelity model of the problem, or a wealth of data and experience, or both,” he says. “But in the case of shaving we are greatly lacking in both!” He continues with the analogy: “I think there is a natural progression—the community started with autonomous driving of toy cars on closed courses and worked up to real cars carrying human passengers; in robotic manipulation we are beginning to move out of the ‘toy car’ stage and so I think it’s good to target high-consequence hard problems to help drive progress.”

Of course, the ultimate goal here is much more general than the creation of a dedicated straight razor shaving robot; it’s a challenge that includes a host of sub-goals that will benefit robotics more generally. This particular hardware system Whitney is developing is actually a testbed for exploring MRI-compatible remote needle biopsy, and he and his students are collaborating with Brigham and Women’s Hospital in Boston on adapting this technology to prostate biopsy and ablation procedures. They’re also exploring how delicate touch can be used as a way to map an environment and localize within it, especially where using vision may not be a good option. “These traits and behaviors are especially interesting for applications where we must interact with delicate and uncertain environments,” says Whitney. “Medical robots, assistive and rehabilitation robots and exoskeletons, and shared-autonomy teleoperation for delicate tasks.”
A paper with more details on this robotic system, “Series Elastic Force Control for Soft Robotic Fluid Actuators,” is available on arXiv.


#437753 iRobot’s New Education Robot Makes ...

iRobot has been on a major push into education robots recently. They acquired Root Robotics in 2019, and earlier this year, launched an online simulator and associated curriculum designed to work in tandem with physical Root robots. The original Root was intended to be a classroom robot, with one of its key features being the ability to stick to (and operate on) magnetic virtual surfaces, like whiteboards. And as a classroom robot, at $200, it’s relatively affordable, if you can buy one or two and have groups of kids share them.

For kids who are more focused on learning at home, though, $200 is a lot for a robot that doesn't even keep your floors clean. And as nice as it is to have a free simulator, any kid will tell you that it’s way cooler to have a real robot to mess around with. Today, iRobot is announcing a new version of Root that’s been redesigned for home use, with a $129 price that makes it significantly more accessible to folks outside of the classroom.

The Root rt0 is a second version of the Root robot—the more expensive, education-grade Root rt1 is still available. To bring the cost down, the rt0 is missing some features that you can still find in the rt1. Specifically, you don’t get the internal magnets to stick the robot to vertical surfaces, there are no cliff sensors, and you don’t get a color scanner or an eraser. But for home use, the internal magnets are probably not necessary anyway, and the rest of that stuff seems like a fair compromise for a cost reduction of about 35 percent.

Photo: iRobot

One of the new accessories for the iRobot Root rt0 is a “Brick Top” that snaps onto the upper face of the robot via magnets. The accessory can be used with LEGO bricks and other LEGO-compatible bricks, opening up an enormous amount of customization.

It’s not all just taking away, though. There’s also a new $20 accessory, a LEGO-ish “Brick Top” that snaps onto the upper face of Root (either version) via magnets. The plate can be used with LEGO bricks and other LEGO-compatible things. This opens up an enormous amount of customization, and it’s for more than just decoration, since Root rt0 has the ability to interact with whatever’s on top of it via its actuated marker. Root can move the marker up and down, the idea being that you can programmatically turn lines on and off. By replacing the marker with a plastic thingy that sticks up through the body of the robot, the marker up/down command can be used to actuate something on the brick top. In the video, that’s what triggers the catapult.
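Because the marker is a single up/down degree of freedom, a dashed line and a catapult launch are, from the program’s point of view, the same thing: a sequence of marker commands. The sketch below is purely hypothetical pseudocode against a mock robot class of our own (it is not the real iRobot Coding API), just to show the shape of such a program:

```python
# Hypothetical mock of a Root-like robot: one up/down marker actuator
# plus a drive command. Real Root programs use iRobot's own coding app;
# this class exists only so the control logic below is runnable.

class FakeRoot:
    def __init__(self):
        self.marker_down = False
        self.log = []          # record of commands, for inspection

    def set_marker(self, down):
        self.marker_down = down
        self.log.append("down" if down else "up")

    def drive(self, cm):
        self.log.append(f"drive {cm}")

def dashed_line(bot, segments, seg_cm):
    """Draw a dashed line: alternate marker-down and marker-up drives."""
    for _ in range(segments):
        bot.set_marker(True)   # pen on paper: this segment is drawn
        bot.drive(seg_cm)
        bot.set_marker(False)  # pen lifted: this gap stays blank
        bot.drive(seg_cm)
```

Swap the marker for the plastic pusher described above and the same `set_marker(True)` call becomes “fire the catapult.”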

Photo: iRobot

By attaching a marker, you can program Root to draw. The robot has a motor that can move the marker up and down.

This less expensive version of Root still has access to the online simulator, as well as the multi-level coding interface that allows kids to seamlessly transition through multiple levels of coding complexity, from graphical to text. There’s a new Android app coming out today, and you can access everything through web-based apps on Chrome OS, Windows and macOS, as well as on iOS. iRobot tells us that they’ve also recently expanded their online learning library full of Root-based educational activities. In particular, they’ve added a new category on “Social Emotional Learning,” the goal of which is to help kids develop things like social awareness, self-management, decision making, and relationship skills. We’re not quite sure how you teach those things with a little hexagonal robot, but we like that iRobot is giving it a try.

The Root coding robot is designed for kids ages 6 and up, ships for free, and is available now.

[ iRobot Root ]


#437749 Video Friday: NASA Launches Its Most ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AWS Cloud Robotics Summit – August 18-19, 2020 – [Virtual Conference]
CLAWAR 2020 – August 24-26, 2020 – [Virtual Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Virtual Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nevada
ICSR 2020 – November 14-16, 2020 – Golden, Colorado
Let us know if you have suggestions for next week, and enjoy today’s videos.

Yesterday was a big day for what was quite possibly the most expensive robot on Earth up until it wasn’t on Earth anymore.

Perseverance and the Ingenuity helicopter are expected to arrive on Mars early next year.

[ JPL ]

ICYMI, our most popular post this week featured Northeastern University roboticist John Peter Whitney literally putting his neck on the line for science! He was testing a remotely operated straight razor shaving robotic system powered by fluidic actuators. The cutting-edge (sorry!) device transmits forces from a primary stage, operated by a barber, to a secondary stage, with the razor attached.

[ John Peter Whitney ]

Together with Boston Dynamics, Ford is introducing a pilot program into our Van Dyke Transmission Plant. Say hello to Fluffy the Robot Dog, who creates fast and accurate 3D scans that help Ford engineers when we’re retooling our plants.

Not shown in the video: “At times, Fluffy sits on its robotic haunches and rides on the back of a small, round Autonomous Mobile Robot, known informally as Scouter. Scouter glides smoothly up and down the aisles of the plant, allowing Fluffy to conserve battery power until it’s time to get to work. Scouter can autonomously navigate facilities while scanning and capturing 3-D point clouds to generate a CAD of the facility. If an area is too tight for Scouter, Fluffy comes to the rescue.”

[ Ford ]

There is a thing that happens at 0:28 in this video that I have questions about.

[ Ghost Robotics ]

Pepper is far more polite about touching than most humans.

[ Paper ]

We don’t usually post pure simulation videos unless they give us something to get really, really excited about. So here’s a pure simulation video.

[ Hybrid Robotics ]

University of Michigan researchers are developing new origami-inspired methods for designing, fabricating, and actuating micro-robots using heat. These improvements will expand the mechanical capabilities of the tiny bots, allowing them to fold into more complex shapes.

[ DRSL ]

HMI is making beastly electric arms work underwater, even if they’re not stapled to a robotic submarine.

[ HMI ]

Here’s some interesting work in progress from MIT’s Biomimetics Robotics Lab. The limb is acting as a “virtual magnet” using a bimodal force and direction sensor.

Thanks Peter!

[ MIT Biomimetics Lab ]

This is adorable but as a former rabbit custodian I can assure you that approximately 3 seconds after this video ended, all of the wires on that robot were chewed to bits.

[ Lingkang Zhang ]

During the ARCHE 2020 integration week, TNO and the ETH Zurich Robotic Systems Lab (RSL) collaborated on integrating their research and development work on the Articulated Locomotion and MAnipulation (ALMA) robot. In addition to integrating the software, we tested it to confirm proper implementation, and we captured visual and auditory data for future development. This all resulted in multiple demos showing the capabilities of the teleoperation framework on the ALMA robot.

[ RSL ]

When we talk about practical applications of quadrupedal robots with wheeled feet, we don’t usually think about them on this scale, although we should.

[ RSL ]

Juan wrote in to share a DIY quadruped that he’s been working on, named CHAMP.

Juan says that the demo robot can be built in less than US $1000 with easily accessible parts. “I hope that my project can provide a more accessible platform for students, researchers, and enthusiasts who are interested to learn more about quadrupedal robot development and its underlying technology.”

[ CHAMP ]

Thanks Juan!

Here’s a New Zealand TV report about a study on robot abuse from Christoph Bartneck at the University of Canterbury.

[ Paper ]

Our Robotics Studio is a hands-on class exposing students to practical aspects of the design, fabrication, and programming of physical robotic systems. So what happens when the class goes virtual due to COVID-19? Things get physical — all @ home.

[ Columbia ]

A few videos from the Supernumerary Robotic Devices Workshop, held online earlier this month.

“Handheld Robots: Bridging the Gap between Fully External and Wearable Robots,” presented by Walterio Mayol-Cuevas, University of Bristol.

“Playing the Piano with 11 Fingers: The Neurobehavioural Constraints of Human Robot Augmentation,” presented by Aldo Faisal, Imperial College London.

[ Workshop ]


#437728 A Battery That’s Tough Enough To ...

Batteries can add considerable mass to any design, and they have to be supported using a sufficiently strong structure, which can add significant mass of its own. Now researchers at the University of Michigan have designed a structural zinc-air battery, one that integrates directly into the machine that it powers and serves as a load-bearing part.

That feature saves weight and thus increases effective storage capacity, adding to the already hefty energy density of the zinc-air chemistry. And the very elements that make the battery physically strong help contain the chemistry’s longstanding tendency to degrade over many hundreds of charge-discharge cycles.

The research is being published today in Science Robotics.

Nicholas Kotov, a professor of chemical engineering, is the leader of the project. He would not say how many watt-hours his prototype stores per gram, but he did note that zinc-air—because it draws on ambient air for its electricity-producing reactions—is inherently about three times as energy-dense as lithium-ion cells. And, because using the battery as a structural part means dispensing with an interior battery pack, you could free up perhaps 20 percent of a machine’s interior. Along with other factors, the new battery could in principle provide as much as 72 times the energy per unit of volume (not of mass) as today’s lithium-ion workhorses.
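As rough arithmetic (our own interpretation of the two figures quoted above, not Kotov’s derivation), the chemistry advantage and the freed interior volume compound: if a fraction f of the machine’s interior no longer has to be reserved for a battery pack, the same machine volume can host 1/(1 − f) times as much battery-plus-payload on top of the roughly 3x chemistry gain. The 72x figure combines further factors the researchers did not enumerate, so it is not re-derived here.

```python
# Back-of-the-envelope compounding of the two stated factors:
# a ~3x volumetric energy-density advantage for zinc-air chemistry,
# and ~20 percent of interior volume freed by making the battery
# structural. This is illustrative arithmetic only.

def effective_volumetric_gain(chemistry_factor, freed_volume_fraction):
    # Freeing a fraction f of the interior scales usable volume by 1/(1-f).
    return chemistry_factor / (1.0 - freed_volume_fraction)

gain = effective_volumetric_gain(3.0, 0.20)  # = 3.75x from these two factors alone
```

Even this conservative combination of just the two quoted numbers yields nearly a 4x system-level improvement before any of the other factors behind the 72x claim are counted.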

Illustration: Alice Kitterman/Science Robotics

“It’s not as if we invented something that was there before us,” Kotov says. ”I look in the mirror and I see my layer of fat—that’s for the storage of energy, but it also serves other purposes,” like keeping you warm in the wintertime. (A similar advance occurred in rocketry when designers learned how to make some liquid propellant tanks load bearing, eliminating the mass penalty of having separate external hull and internal tank walls.)

Others have spoken of putting batteries, including the lithium-ion kind, into load-bearing parts in vehicles. Ford, BMW, and Airbus, for instance, have expressed interest in the idea. The main problem to overcome is the tradeoff in load-bearing batteries between electrochemical performance and mechanical strength.

Image: Kotov Lab/University of Michigan

Key to the battery's physical toughness and to its long life cycle is the nanofiber membrane, made of Kevlar.

The Michigan group gets both qualities by using a solid electrolyte (which can’t leak under stress) and by covering the electrodes with a membrane whose nanostructure of fibers is derived from Kevlar. That makes the membrane tough enough to suppress the growth of dendrites—branching fibers of metal that tend to form on an electrode with every charge-discharge cycle and which degrade the battery.

The Kevlar need not be purchased new but can be salvaged from discarded body armor. Other manufacturing steps should be easy, too, Kotov says. He has only just begun to talk to potential commercial partners, but he says there’s no reason why his battery couldn’t hit the market in the next three or four years.

Drones and other autonomous robots might be the most logical first application because their range is so tightly constrained by their battery capacity. Also, because such robots don’t carry people about, they face less of a hurdle from safety regulators leery of a fundamentally new battery type.

“And it’s not just about the big Amazon robots but also very small ones,” Kotov says. “Energy storage is a very significant issue for small and flexible soft robots.”

Here’s a video showing how Kotov’s lab has used batteries to form the “exoskeleton” of robots that scuttle like worms or scorpions.
