Tag Archives: intelligent

#439349 The Four Stages of Intelligent Matter ...

Imagine clothing that can warm or cool you, depending on how you’re feeling. Or artificial skin that responds to touch and temperature and automatically wicks away moisture. Or cyborg hands controlled with DNA motors that can adjust based on signals from the outside world.

Welcome to the era of intelligent matter—an unconventional AI computing idea woven directly into the fabric of synthetic matter. Powered by brain-based computing, these materials can weave the skins of soft robots or form microswarms of drug-delivering nanobots, all while conserving power as they learn and adapt.

Sound like sci-fi? It gets weirder. The key to intelligent matter, said Dr. W.H.P. Pernice at the University of Münster and colleagues, is a “brain” distributed across the material’s “body”—a structure far more alien than that of our own minds.

Picture a heated blanket. Rather than powering it with a single controller, it’ll have computing circuits sprinkled all over. This computing network can then tap into a type of brain-like process, called “neuromorphic computing.” This technological fairy dust then transforms a boring blanket into one that learns what temperature you like and at what times of the day to predict your preferences as a new season rolls around.

Oh yeah, and if made from nano-sized building blocks, it could also reshuffle its internal structure to store your info with a built-in memory.

“The long-term goal is de-centralized neuromorphic computing,” said Pernice. Taking inspiration from nature, we can then begin to engineer matter that’s powered by brain-like hardware, running AI across the entire material.

In other words: Iron Man’s Endgame nanosuit? Here we come.

Why Intelligent Matter?
From rockets that could send us to Mars to a plain cotton T-shirt, we’ve done a pretty good job using materials we either developed or harvested. But that’s all they are—passive matter.

In contrast, nature is rich with intelligent matter. Take human skin. It’s waterproof, only selectively allows some molecules in, and protects us from pressure, friction, and most bacteria and viruses. It can also heal itself after a scratch or rip, and it senses outside temperature to cool us down when it gets too hot.

While our skin doesn’t “think” in the traditional sense, it can shuttle information to the brain in a blink. Then the magic happens. With some 86 billion neurons, the brain can run massively parallel computations in its circuits while consuming only about 20 watts—not too different from the 13” MacBook Pro I’m currently typing on. Why can’t a material do the same?

The problem is that our current computing architecture struggles to support brain-like computing because of energy costs and time lags.

Enter neuromorphic computing. It’s an idea that hijacks the brain’s ability to process data simultaneously with minimal energy. To get there, scientists are redesigning computer chips from the ground up. For example, instead of today’s chips that divorce computing modules from memory modules, these chips process information and store it at the same location. It might seem weird, but it’s what our brains do when learning and storing new information. This arrangement slashes the need for wires between memory and computation modules, essentially teleporting information rather than sending it down a traffic-jammed cable.

The end result is massively parallel computing at a very low energy cost.
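To make that idea concrete, here’s a toy software sketch of compute-in-memory, loosely modeled on a memristor-style crossbar where each cell’s stored conductance is also the multiplier. The class and method names are purely illustrative—this isn’t any real hardware API:

```python
import numpy as np

# Toy sketch of in-memory computing: in a memristive crossbar, each cell's
# stored conductance IS the multiplier (Ohm's law: current = G * voltage),
# so memory and computation share one location.
class CrossbarArray:
    def __init__(self, conductances):
        # Each entry is a cell that both stores a weight and performs
        # its share of the multiply.
        self.G = np.asarray(conductances, dtype=float)

    def read(self, voltages):
        # Applying voltages to the rows yields column currents in one step:
        # the matrix-vector product happens "inside" the memory array,
        # with no shuttling of weights across a bus.
        return self.G.T @ np.asarray(voltages, dtype=float)

    def update(self, row, col, delta):
        # Learning = nudging a stored conductance in place.
        self.G[row, col] += delta

xbar = CrossbarArray([[0.5, 1.0],
                      [2.0, 0.0]])
print(xbar.read([1.0, 1.0]))  # column currents: [2.5, 1.0]
```

The point of the sketch is the locality: `read` never copies the weights anywhere, which is the (simulated) analog of slashing the wires between memory and compute.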

The Road to Intelligent Matter
In Pernice and his colleagues’ opinion, there are four stages that can get us to intelligent matter.

The first is structural—basically your run-of-the-mill matter that can be complex but can’t change its properties. Think 3D printed frames of a lung or other organs. Intricate, but not adaptable.

Next is responsive matter. This can shift its makeup in response to the environment. Like an octopus changing its skin color to hide from predators, these materials can change their shape, color, or stiffness. One example is a 3D printed sunflower embedded with sensors that blossoms or closes depending on heat, force, and light. Another is responsive soft materials that can stretch and plug into biological systems, such as an artificial muscle made of silicone that can repeatedly stretch and lift over 13 pounds when heated. While it’s a neat trick, this matter doesn’t adapt and can only follow its pre-programmed fate.

Higher up the intelligence food chain are adaptive materials. These have a built-in network to process information, temporarily store it, and adjust behavior from that feedback. One example is micro-swarms of tiny robots that move in a coordinated way, similar to schools of fish or flocks of birds. But because their behavior is also pre-programmed, they can’t learn from or remember their environment.
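As a sketch of what “pre-programmed” coordination looks like, here’s a minimal boids-style alignment rule: each agent steers toward the average heading of its neighbors. The swarm moves coherently, but the rule is fixed at design time, so nothing is learned or remembered. (The function and its parameters are invented for illustration.)

```python
# Minimal boids-style alignment rule: each agent nudges its heading
# toward the mean heading of its neighbors. Coordination emerges, but
# the behavior is hard-coded; the swarm never learns from experience.
def align(headings, neighbor_lists, rate=0.5):
    """One update step over a list of headings (in degrees)."""
    new = []
    for i, h in enumerate(headings):
        nbrs = neighbor_lists[i]
        if not nbrs:
            new.append(h)  # no neighbors, no change
            continue
        mean = sum(headings[j] for j in nbrs) / len(nbrs)
        new.append(h + rate * (mean - h))
    return new

# Three agents, each seeing the other two.
headings = [0.0, 90.0, 180.0]
neighbors = [[1, 2], [0, 2], [0, 1]]
print(align(headings, neighbors))  # [67.5, 90.0, 112.5]
```

Run the step repeatedly and the headings converge—but change the environment and the rule stays exactly the same, which is the limitation the next stage addresses.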

Finally, there’s intelligent material, which can learn and memorize.

“[It] is able to interact with its environment, learn from the input it receives, and self-regulates its action,” the team wrote.

It starts with four components. The first is a sensor, which captures information from both the outside world and the material’s internal state—think of a temperature sensor on your skin. Next is an actuator, basically something that changes a property of the material—for example, making your skin sweat more as the temperature goes up. The third is a memory unit that can store information long-term and save it as knowledge for the future. The fourth is a network—Bluetooth, wireless, or whatnot—that connects each component, similar to the nerves in our bodies.
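Here’s a hypothetical Python sketch of how those four components might fit together in software, using the sweating-skin example. The class, the 33 °C comfort target, and the “sweat rate” actuator are all invented for illustration, and the network is left implicit:

```python
# Hypothetical sketch of the four-component loop: sensor readings feed a
# memory unit, and the actuator responds to accumulated experience, not
# just the latest input. All names and numbers are illustrative.
class IntelligentPatch:
    def __init__(self, target_temp=33.0):
        self.memory = []            # memory unit: stores past readings
        self.target = target_temp   # assumed comfortable skin temperature
        self.sweat_rate = 0.0       # actuator state

    def sense(self, temp):
        # Sensor: capture an external reading and commit it to memory.
        self.memory.append(temp)

    def act(self):
        # Actuator: respond to the learned average of past readings,
        # so behavior adapts as experience accumulates.
        avg = sum(self.memory) / len(self.memory)
        self.sweat_rate = max(0.0, avg - self.target)
        return self.sweat_rate

# In real intelligent matter, the fourth component--the network--would link
# many such sensor/actuator/memory units distributed through the material.
patch = IntelligentPatch()
for reading in [32.0, 34.0, 36.0]:
    patch.sense(reading)
print(patch.act())  # average 34.0 is 1.0 above target -> sweat_rate 1.0
```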

“The close interplay between all four functional elements is essential for processing information, which is generated during the entire process of interaction between matter and the environment, to enable learning,” the team said.

How?
Here’s where neuromorphic computing comes in.

“Living organisms, in particular, can be considered as unconventional computing systems,” the authors said. “Programmable and highly interconnected networks are particularly well suited to carrying out these tasks and brain-inspired neuromorphic hardware aims.”

The brain runs on neurons and synapses—the junctions that connect individual neurons into networks. Scientists have tapped into a wide variety of materials to engineer artificial components of the brain connected into networks. Google’s tensor processing unit and IBM’s TrueNorth are both famous examples; they allow computation and memory to occur in the same place, making them especially powerful for running AI algorithms.

But the next step, said the authors, is to distribute these mini brains inside a material while adding sensors and actuators, essentially forming a circuit that mimics the entire human nervous system. For the matter to respond quickly, we may need to tap into other technologies.

One idea is to use light. Chips built on optical neural networks can compute and transmit data at the speed of light. Another is to build materials that can reflect on their own decisions, with neural networks that listen and learn. Add to that matter that can physically change its form based on input—like water turning to ice—and we may have a library of intelligent matter that could transform multiple industries, especially autonomous nanobots and lifelike prosthetics.

“A wide variety of technological applications of intelligent matter can be foreseen,” the authors said.

Image Credit: ktsdesign / Shutterstock.com

Posted in Human Robots

#439235 Video Friday: Intelligent Drone Swarms

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

ICRA 2021 – May 30-June 5, 2021 – [Online Event]
RoboCup 2021 – June 22-28, 2021 – [Online Event]
RSS 2021 – July 12-16, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
IROS 2021 – September 27-October 1, 2021 – [Online Event]
ROSCon 2021 – October 21-23, 2021 – New Orleans, LA, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

Drones in swarms (especially large swarms) generally rely on a centralized controller to keep them organized and from crashing into each other. But as swarms get larger and do more stuff, that's something that you can't always rely on, so folks at EPFL are working on a localized inter-drone communication system that can accomplish the same thing.

Predictive control of aerial swarms in cluttered environments, by Enrica Soria, Fabrizio Schiano and Dario Floreano from EPFL, is published this week in Nature Machine Intelligence.

[ EPFL ]

It takes a talented team of brilliant people to build Roxo, the first FedEx autonomous delivery robot. Watch this video to meet a few of the faces behind the bot–at FedEx Office and at DEKA Research.

Hey has anyone else noticed that the space between the E and the X in the FedEx logo looks kinda like an arrow?

[ FedEx ]

Thanks Fan!

Lingkang Zhang’s latest quadruped, ChiTu, runs ROS on a Raspberry Pi 4B. Despite its mostly 3D printed-ness and low-cost servos, it looks to be quite capable.

[ Lingkang Zhang ]

Thanks Lingkang!

Wolfgang-OP is an open-source humanoid platform designed for RoboCup, which means it's very good at falling over and not exploding.

[ Hamburg Bit-Bots ]

Thanks Fan!

NASA’s Perseverance rover has been on the surface of Mars since February of 2021, joining NASA’s Curiosity rover, which has been studying the Red Planet since 2012. Perseverance is now beginning to ramp up its science mission on Mars while preparing to collect samples that will be returned to Earth on a future mission. Curiosity is ready to explore some new Martian terrain. This video provides a mission update from Perseverance Surface Mission Manager Jessica Samuels and Curiosity Deputy Project Scientist Abigail Fraeman.

[ NASA ]

It seems kinda crazy to me that this is the best solution for this problem, but I’m glad it works.

[ JHU LCSR ]

At USC’s Center for Advanced Manufacturing, we have developed a spray painting robot, which we used to paint a USC-themed Tommy Trojan mural.

[ USC ]

ABB Robotics is driving automation in the construction industry with new robotic automation solutions to address key challenges, including the need for more affordable and environmentally friendly housing and to reduce the environmental impact of construction, amidst a labor and skills shortage.

[ ABB ]

World’s first! Get to know our new avocado packing robot, the Speedpacker, which we have developed in conjunction with the machinery maker Selo. With this innovative robot, we pack avocados ergonomically and efficiently to be an even better partner for our customers and growers.

[ Nature's Pride ]

KUKA robots with high payload capacities were used for medical technology applications for the first time at the turn of the millennium. To this day, robots with payload capacities of up to 500 kilograms are a mainstay of medical robotics.

[ Kuka ]

We present a differential inverse kinematics control framework for task-space trajectory tracking, force regulation, obstacle and singularity avoidance, and pushing an object toward a goal location, with limited sensing and knowledge of the environment.

[ Dynamic Systems Lab ]

Should robots in the real world trust models? I wouldn't!

[ Science Robotics ]

Mark Muhn has worked with the US FES CYBATHLON team Cleveland since 2012. For FES cycling, he uses surgically implanted intramuscular electrodes. At CYBATHLON 2016 and 2020, Mark finished in first and third place, respectively. At the recent International IEEE EMBS Conference on Neural Engineering (NER21), he described the importance of user-centered design.

[ Cybathlon ]

This just-posted TEDx talk entitled “Towards the robots of science fiction” from Caltech’s Aaron Ames was recorded back in 2019, which I mention only to alleviate any anxiety you might feel seeing so many people maskless indoors.

I don’t know exactly what Aaron was doing at 3:00, but I feel like we’ve all been there with one robot or another.

[ AMBER Lab ]

Are you ready for your close-up? Our newest space-exploring cameras are bringing the universe into an even sharper focus. Imaging experts on our Mars rovers teams will discuss how we get images from millions of miles away to your screens.

[ JPL ]

Some of the world's top universities have entered the DARPA Subterranean Challenge, developing technologies to map, navigate, and search underground environments. Led by CMU's Robotics Institute faculty members Sebastian Scherer and Matt Travers, as well as OSU's Geoff Hollinger, Team Explorer has earned first and second place positions in the first two rounds of competition. They look forward to this third and final year of the challenge, with the competition featuring all the subdomains of tunnel systems, urban underground, and cave networks. Sebastian, Matt, and Geoff discuss and demo some of the exciting technologies under development.

[ Explorer ]

An IFRR Global Robotics Colloquium on “The Future of Robotic Manipulation.”

Research in robotic manipulation has made tremendous progress in recent years. This progress has been brought about by researchers pursuing different, and possibly synergistic approaches. Prominent among them, of course, is deep reinforcement learning. It stands in opposition to more traditional, model-based approaches, which depend on models of geometry, dynamics, and contact. The advent of soft grippers and soft hands has led to substantial success, enabling many new applications of robotic manipulation. Which of these approaches represents the most promising route towards progress? Or should we combine them to push our field forward? How can we close the substantial gap between robotic and human manipulation capabilities? Can we identify and transfer principles of human manipulation to robots? These are some of the questions we will attempt to answer in this exciting panel discussion.

[ IFRR ] Continue reading

Posted in Human Robots

#437386 Scary A.I. more intelligent than you

GPT-3 (Generative Pre-trained Transformer 3) is an artificial intelligence language generator that uses deep learning to produce human-like text. Its output is of such high quality that it can be very difficult to distinguish from a human’s. Many scientists, researchers and engineers (including Stephen …

Posted in Human Robots

#439081 Classify This Robot-Woven Sneaker With ...

For athletes trying to run fast, the right shoe can be essential to achieving peak performance. For athletes trying to run as fast as humanly possible, a runner’s shoe can also become a work of individually customized engineering.

This is why Adidas has married 3D printing with robotic automation in a mass-market footwear project it calls Futurecraft.Strung, expected to be available for purchase as soon as later this year. Using a customized, 3D-printed sole, a Futurecraft.Strung manufacturing robot can place some 2,000 threads from up to 10 different sneaker yarns in one upper section of the shoe.

Skylar Tibbits, founder and co-director of the Self-Assembly Lab and associate professor in MIT's Department of Architecture, says that because of its small scale, footwear has been an area of focus for 3D printing and additive manufacturing, which involves adding material bit by bit.

“There are really interesting complex geometry problems,” he says. “It’s pretty well suited.”

Photo: Adidas

Beginning with a 3D-printed sole, Adidas robots weave together some 2,000 threads from up to 10 different sneaker yarns to make one Futurecraft.Strung shoe—expected on the marketplace later this year or sometime in 2022.

Adidas began working on the Futurecraft.Strung project in 2016. Then two years later, Adidas Futurecraft, the company’s innovation incubator, began collaborating with digital design studio Kram/Weisshaar. In less than a year the team built the software and hardware for the upper part of the shoe, called Strung uppers.

“Most 3D printing in the footwear space has been focused on the midsole or outsole, like the bottom of the shoe,” Tibbits explains. But now, he says, Adidas is bringing robotics and a threaded design to the upper part of the shoe. The company bases its Futurecraft.Strung design on high-resolution scans of how runners’ feet move as they travel.

This more flexible design can benefit athletes in multiple sports, according to an Adidas blog post. It will be able to use motion capture of an athlete’s foot and feedback from the athlete to tailor the design to that athlete’s gait. Adidas customizes the weaving of the shoe’s “fabric” (really more like an elaborate woven string figure, a cat’s cradle to fit the foot) to achieve a close and comfortable fit, the company says.

What they call their “4D sole” consists of a design combining 3D printing with materials that can change their shape and properties over time. In fact, Tibbits coined the term 4D printing to describe this process in 2013. The company takes customized data from the Adidas Athlete Intelligent Engine to make the shoe, according to Kram/Weisshaar’s website.

Photo: Adidas

Closeup of the weaving process behind a Futurecraft.Strung shoe

“With Strung for the first time, we can program single threads in any direction, where each thread has a different property or strength,” Fionn Corcoran-Tadd, an innovation designer at Adidas’ Futurecraft lab, said in a company video. Each thread serves a purpose, the video noted. “This is like customized string art for your feet,” Tibbits says.

Although the robotics technology the company uses has been around for many years, what Adidas’s robotic weavers can achieve with thread is a matter of elaborate geometry. “It’s more just like a really elegant way to build up material combining robotics and the fibers and yarns into these intricate and complex patterns,” he says.

Robots can of course create patterns with more precision than a person winding thread by hand, and they can rapidly and reliably change the yarn and color of the fabric pattern. Adidas says it can make a single upper in 45 minutes and a pair of sneakers in 1 hour and 30 minutes, and the company plans to reduce this time to minutes in the months ahead.

An Adidas spokesperson says sneakers incorporating the Futurecraft.Strung uppers design are a prototype, but the company plans to bring a Strung shoe to market in late 2021 or 2022. However, Adidas Futurecraft sneakers are currently available with a 3D-printed midsole.
Adidas plans to continue gathering data from athletes to customize the uppers of sneakers. “We’re building up a library of knowledge and it will get more interesting as we aggregate data of testing and from different athletes and sports,” the Adidas Futurecraft team writes in a blog post. “The more we understand about how data can become design code, the more we can take that and apply it to new Strung textiles. It’s a continuous evolution.”

Posted in Human Robots

#439040 Ready for duty: Healthcare robots get ...

Not long after the 1918 Spanish flu pandemic, Czech writer Karel Čapek first introduced the term “robot” to describe artificial people in his 1921 sci-fi play R.U.R. While we have not yet created the highly intelligent humanoid robots imagined by Čapek, the robots most commonly used today are complex systems that work alongside humans, assisting with an ever-expanding set of tasks.

Posted in Human Robots