
#435514 How Robotics Teams Prepared for ...

Roboticists share what they've learned so far from DARPA's Subterranean Challenge, and how they readied their bots for the next event—the Tunnel Circuit.

Posted in Human Robots

#435505 This Week’s Awesome Stories From ...

This Is the Computer You’ll Wear on Your Face in 10 Years
Mark Sullivan | Fast Company
“[Snap’s new Spectacles 3] foreshadow a device that many of us may wear as our primary personal computing device in about 10 years. Based on what I’ve learned by talking AR with technologists in companies big and small, here is what such a device might look like and do.”

These Robo-Shorts Are the Precursor to a True Robotic Exoskeleton
Devin Coldewey | TechCrunch
“The whole idea, then, is to leave behind the idea of an exosuit as a big mechanical thing for heavy industry or work, and bring in the idea that one could help an elderly person stand up from a chair, or someone recovering from an accident walk farther without fatigue.”

Artificial Tree Promises to Suck Up as Much Air Pollution as a Small Forest
Luke Dormehl | Digital Trends
“The company has developed an artificial tree that it claims is capable of sucking up the equivalent amount of air pollution as 368 living trees. That’s not only a saving on growing time, but also on the space needed to accommodate them.”

The Anthropocene Is a Joke
Peter Brannen | The Atlantic
“Unless we fast learn how to endure on this planet, and on a scale far beyond anything we’ve yet proved ourselves capable of, the detritus of civilization will be quickly devoured by the maw of deep time.”

DeepMind’s Losses and the Future of Artificial Intelligence
Gary Marcus | Wired
“Still, the rising magnitude of DeepMind’s losses is worth considering: $154 million in 2016, $341 million in 2017, $572 million in 2018. In my view, there are three central questions: Is DeepMind on the right track scientifically? Are investments of this magnitude sound from Alphabet’s perspective? And how will the losses affect AI in general?”

Image Credit: Tithi Luadthong / Shutterstock.com


#435494 Driverless Electric Trucks Are Coming, ...

Self-driving and electric cars just don’t stop making headlines lately. Amazon invested in self-driving startup Aurora earlier this year. Waymo, Daimler, GM, along with startups like Zoox, have all launched or are planning to launch driverless taxis, many of them all-electric. People are even yanking driverless cars from their timeless natural habitat—roads—to try to teach them to navigate forests and deserts.

The future of driving, it would appear, is upon us.

But an equally important vehicle often left out of the conversation is the truck; its relevance to our day-to-day lives may not be as visible as that of cars, but its impact is more profound than most of us realize.

Two recent developments in trucking point to a future of self-driving, electric semis hauling goods across the country, and likely doing so more quickly, cheaply, and safely than trucks do today.

Self-Driving in Texas
Last week, Kodiak Robotics announced it’s beginning its first commercial deliveries using self-driving trucks on a route from Dallas to Houston. The two cities sit about 240 miles apart, connected primarily by Interstate 45. Kodiak is aiming to expand its reach far beyond the heart of Texas (if Dallas and Houston can be considered the heart, that is) to the state’s most far-flung cities, including El Paso to the west and Laredo to the south.

If self-driving trucks are going to be constrained to staying within state lines (and given that the laws regulating them differ by state, they will be for the foreseeable future), Texas is a pretty ideal option. It’s huge (thousands of miles of highway run both east-west and north-south), it’s warm (better than cold for driverless tech components like sensors), its proximity to Mexico means constant movement of both raw materials and manufactured goods (basically, you can’t have too many trucks in Texas), and most crucially, it’s lax on laws (driverless vehicles have been permitted there since 2017).

Spoiler, though—the trucks won’t be fully unmanned. They’ll have safety drivers to guide them onto and off of the highway, and to be there in case of any unexpected glitches.

California Goes (Even More) Electric
According to some top executives in the rideshare industry, automation is just one key component of the future of driving. Another is electricity replacing gas, and it’s not just carmakers that are plugging into the trend.

This week, Daimler Trucks North America announced completion of its first electric semis for customers Penske and NFI, to be used in the companies’ southern California operations. Scheduled to start operating later this month, the trucks will essentially be guinea pigs for testing integration of electric trucks into large-scale fleets; intel gleaned from the trucks’ performance will impact the design of later models.

Design-wise, the trucks aren’t much different from any other semi you’ve seen lumbering down the highway recently. Their range is about 250 miles—not bad if you think about how much more weight a semi is pulling than a passenger sedan—and they’ve been dubbed eCascadia, an electrified version of Freightliner’s heavy-duty Cascadia truck.

Batteries have a long way to go before they can store enough energy to make electric trucks truly viable (not to mention setting up a national charging infrastructure), but Daimler’s announcement is an important step toward an electrically driven future.

Keep on Truckin’
Obviously, it’s more exciting to think about hailing one of those cute little Waymo cars with no steering wheel to shuttle you across town than it is to think about that 12-pack of toilet paper you ordered on Amazon cruising down the highway in a semi while the safety driver takes a snooze. But pushing driverless and electric tech in the trucking industry makes sense for a few big reasons.

Trucks mostly run long routes on interstate highways—with no pedestrians, stoplights, or other city-street obstacles to contend with, highway driving is much easier to automate. What glitches there are to be smoothed out may as well be smoothed out with cargo on board rather than people. And though you wouldn’t know it amid the frantic shouts of ‘a robot could take your job!’, the US is actually in the midst of a massive shortage of truck drivers—60,000 short as of earlier this year, to be exact.

As Todd Spencer, president of the Owner-Operator Independent Drivers Association, put it, “Trucking is an absolutely essential, critical industry to the nation, to everybody in it.” Alas, trucks get far less love than cars, but come on—probably 90 percent of the things you ate, bought, or used today were at some point moved by a truck.

Adding driverless and electric tech into that equation, then, should yield positive outcomes on all sides, whether we’re talking about cheaper 12-packs of toilet paper, fewer traffic fatalities due to human error, a less-strained labor force, a stronger economy… or something pretty cool to see as you cruise down the highway in your (driverless, electric, futuristic) car.

Image Credit: Vitpho / Shutterstock.com


#435423 Moving Beyond Mind-Controlled Limbs to ...

Brain-machine interface enthusiasts often gush about “closing the loop.” It’s for good reason. On the implant level, it means engineering smarter probes that only activate when they detect faulty electrical signals in brain circuits. Elon Musk’s Neuralink, among other players, is actively pursuing these bidirectional implants that both measure and zap the brain.

But to scientists laboring to restore functionality to paralyzed patients or amputees, “closing the loop” has broader connotations. Building smart mind-controlled robotic limbs isn’t enough; the next frontier is restoring sensation in offline body parts. To truly meld biology with machine, the robotic appendage has to “feel one” with the body.

This month, two studies from Science Robotics describe complementary ways forward. In one, scientists from the University of Utah paired a state-of-the-art robotic arm—the DEKA LUKE—with electrical stimulation of the remaining nerves above the attachment point. Using artificial zaps to mimic the skin’s natural response patterns to touch, the team dramatically increased the patient’s ability to identify objects. Without much training, he could easily discriminate between small and large objects, and between soft and hard ones, while blindfolded and wearing headphones.

In another, a team based at the National University of Singapore took inspiration from our largest organ, the skin. Mimicking the neural architecture of biological skin, the engineered “electronic skin” not only senses temperature, pressure, and humidity, but continues to function even when scraped or otherwise damaged. Thanks to artificial nerves that transmit signals far faster than biological ones, the flexible e-skin shoots electrical data roughly 1,000 times quicker than human skin can.

Together, the studies marry neuroscience and robotics. Representing the latest push towards closing the loop, they show that integrating biological sensibilities with robotic efficiency isn’t impossible (super-human touch, anyone?). But more immediately—and more importantly—they’re beacons of hope for patients who hope to regain their sense of touch.

For one of the participants, a late middle-aged man with speckled white hair who lost his forearm 13 years ago, superpowers, cyborgs, or razzle-dazzle brain implants are the last thing on his mind. After a barrage of emotionally-neutral scientific tests, he grasped his wife’s hand and felt her warmth for the first time in over a decade. His face lit up in a blinding smile.

That’s what scientists are working towards.

Biomimetic Feedback
The human skin is a marvelous thing. Not only does it rapidly detect a multitude of sensations—pressure, temperature, itch, pain, humidity—but its wiring also “binds” disparate signals together into a sensory fingerprint that helps the brain identify what it’s feeling at any moment. Thanks to over 45 miles of nerves that connect the skin, muscles, and brain, you can pick up a half-full coffee cup, knowing that it’s hot and sloshing, while staring at your computer screen. Unfortunately, this complexity is also why restoring sensation is so hard.

The sensory electrode array implanted in the participant’s arm. Image Credit: George et al., Sci. Robot. 4, eaax2352 (2019).
However, complex neural patterns can also be a source of inspiration. Previous cyborg arms have often been paired with so-called “standard” sensory algorithms to induce a basic sense of touch in the missing limb. Here, electrodes zap residual nerves with intensities proportional to the contact force: the harder the grip, the stronger the electrical feedback. Although seemingly logical, that’s not how our skin works. Every time the skin touches or leaves an object, its nerves shoot strong bursts of activity to the brain; while in full contact, the signal is much lower. The resulting electrical strength curve resembles a “U.”
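
The difference between the two policies can be sketched in a few lines. This is an illustrative toy, not the study's actual stimulation algorithm; the gains, threshold, and burst term are invented for demonstration.

```python
# Illustrative toy, not the study's stimulation algorithm: contrast a
# "standard" proportional policy with a biomimetic one that emphasizes
# contact onset and release, yielding the U-shaped curve described above.

def standard_feedback(forces, gain=1.0):
    """Stimulation intensity proportional to contact force."""
    return [gain * f for f in forces]

def biomimetic_feedback(forces, gain=1.0, burst=3.0, threshold=0.05):
    """Strong bursts when contact begins or ends; a much lower drive
    while the grip force is steady."""
    out, prev = [], 0.0
    for f in forces:
        change = abs(f - prev)
        if change > threshold:       # contact onset or release
            out.append(gain * f + burst * change)
        else:                        # sustained, steady contact
            out.append(0.3 * gain * f)
        prev = f
    return out

# A simple grasp profile: touch, hold, release.
grasp = [0.0, 0.8, 1.0, 1.0, 1.0, 0.2, 0.0]
print(standard_feedback(grasp))    # stays high while holding
print(biomimetic_feedback(grasp))  # spikes at touch and release
```

Run on the toy grasp profile, the proportional policy stays high for the whole hold, while the biomimetic one spikes at contact and release and drops low in between.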

The LUKE hand. Image Credit: George et al., Sci. Robot. 4, eaax2352 (2019).
The team decided to directly compare standard algorithms with one that better mimics the skin’s natural response. They fitted a volunteer with a robotic LUKE arm and implanted an array of electrodes into his forearm—right above the amputation—to stimulate the remaining nerves. When the team activated different combinations of electrodes, the man reported sensations of vibration, pressure, tapping, or a sort of “tightening” in his missing hand. Some combinations of zaps also made him feel as if he were moving the robotic arm’s joints.

In all, the team was able to carefully map nearly 120 sensations to different locations on the phantom hand, which they then overlapped with contact sensors embedded in the LUKE arm. For example, when the patient touched something with his robotic index finger, the relevant electrodes sent signals that made him feel as if he were brushing something with his own missing index fingertip.
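
The mapping described above can be pictured as a simple routing table. The sensor names, electrode indices, and evoked sensations below are invented for illustration; they are not the study's actual map.

```python
# Toy illustration -- sensor names, electrode indices, and evoked
# sensations are invented, not the study's actual mapping.
sensation_map = {
    "index_fingertip": {3, 17},  # combo that evoked index-finger pressure
    "thumb_pad":       {5, 21},  # combo that evoked thumb tapping
    "palm":            {9},      # combo that evoked palm "tightening"
}

def route_touch(active_sensors):
    """Return the set of electrodes to stimulate for the contact
    sensors currently reporting touch on the robotic hand."""
    electrodes = set()
    for sensor in active_sensors:
        electrodes |= sensation_map.get(sensor, set())
    return electrodes

print(sorted(route_touch(["index_fingertip", "palm"])))  # [3, 9, 17]
```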

Standard sensory feedback already helped: even with simple electrical stimulation, the man could tell apart size (golf versus lacrosse ball) and texture (foam versus plastic) while blindfolded and wearing noise-canceling headphones. But when the team implemented two types of neuromimetic feedback—electrical zaps that resembled the skin’s natural response—his performance dramatically improved. He was able to identify objects much faster and more accurately under their guidance. Outside the lab, he also found it easier to cook, feed, and dress himself. He could even text on his phone and complete routine chores that were previously too difficult, such as stuffing an insert into a pillowcase, hammering a nail, or eating hard-to-grab foods like eggs and grapes.

The study shows that the brain more readily accepts biologically-inspired electrical patterns, making it a relatively easy—but enormously powerful—upgrade that seamlessly integrates the robotic arms with the host. “The functional and emotional benefits…are likely to be further enhanced with long-term use, and efforts are underway to develop a portable take-home system,” the team said.

E-Skin Revolution: Asynchronous Coded Electronic Skin (ACES)
Flexible electronic skins also aren’t new, but the second team presented an upgrade in both speed and durability while retaining multiplexed sensory capabilities.

Starting from a combination of rubber, plastic, and silicon, the team embedded over 200 sensors onto the e-skin, each capable of discerning contact, pressure, temperature, and humidity. They then looked to the skin’s nervous system for inspiration. Our skin is embedded with a dense array of nerve endings that individually transmit different types of sensations, which are integrated inside hubs called ganglia. Compared to having every single nerve ending directly ping data to the brain, this “gather, process, and transmit” architecture rapidly speeds things up.

The team tapped into this biological architecture. Rather than pairing each sensor with a dedicated receiver, ACES sends all sensory data to a single receiver—an artificial ganglion. This setup lets the e-skin’s wiring work as a whole system, as opposed to individual electrodes. Every sensor transmits its data using a characteristic pulse, which allows it to be uniquely identified by the receiver.

The gains were immediate. First was speed. Normally, sensory data from multiple individual electrodes need to be periodically combined into a map of pressure points. Here, data from thousands of distributed sensors can independently go to a single receiver for further processing, massively increasing efficiency—the new e-skin’s transmission rate is roughly 1,000 times faster than that of human skin.

Second was redundancy. Because data from individual sensors are aggregated, the system still functioned even when individual receptors were damaged, making it far more resilient than previous attempts. Finally, the setup could easily scale up. Although the team only tested the idea with 240 sensors, theoretically the system should work with up to 10,000.

The team is now exploring ways to combine their invention with other material layers to make it water-resistant and self-repairable. As you might’ve guessed, an immediate application is to give robots something similar to complex touch. A sensory upgrade not only lets robots more easily manipulate tools, doorknobs, and other objects in hectic real-world environments, it could also make it easier for machines to work collaboratively with humans in the future (hey Wall-E, care to pass the salt?).

Dexterous robots aside, the team also envisions engineering better prosthetics. When coated onto cyborg limbs, for example, ACES may give them a better sense of touch that begins to rival the human skin—or perhaps even exceed it.

Regardless, efforts that adapt the functionality of the human nervous system to machines are finally paying off, and more are sure to come. Neuromimetic ideas may very well be the link that finally closes the loop.

Image Credit: Dan Hixson/University of Utah College of Engineering.


#435370 The Rise of the Robots. Soon!

Are we the masters of our own eventual demise at the…hand of our robot creations? From where I’m standing, it sure looks like it!
