Brain-Like Chips Now Beat the Human ...

Move over, deep learning. Neuromorphic computing—the next big thing in artificial intelligence—is on fire.

Just last week, two studies independently unveiled computer chips modeled after information processing in the human brain. The first, published in Nature Materials, found a way to tame the unpredictability of artificial synapses—the gaps between two neurons that transmit and store information. The second, published in Science Advances, further amped up the system’s computational power, filling synapses with nanoclusters of magnetic material to bolster information encoding.

The result? Brain-like hardware systems that compute faster—and more efficiently—than the human brain.

“Ultimately we want a chip as big as a fingernail to replace one big supercomputer,” said Dr. Jeehwan Kim, who led the first study at MIT in Cambridge, Massachusetts.

Experts are hopeful. “The field’s full of hype, and it’s nice to see quality work presented in an objective way,” said Dr. Carver Mead, an engineer at the California Institute of Technology in Pasadena not involved in the work.

Software to Hardware

The human brain is the ultimate computational wizard. With roughly 100 billion neurons densely packed into a space the size of a small football, the brain can deftly handle complex computation at lightning speed using very little energy.

AI experts have taken note. The past few years saw brain-inspired algorithms that can identify faces, fake voices, and play a variety of games at—and often above—human capability.

But software is only part of the equation. Our current computers, with their transistors and binary digital systems, aren’t well suited to running these powerful algorithms efficiently.

That’s where neuromorphic computing comes in. The idea is simple: fabricate a computer chip that mimics the brain at the hardware level. Here, data is both processed and stored within the chip in an analog manner. Each artificial synapse accumulates and integrates small bits of information from multiple sources and fires only when it reaches a threshold—much like its biological counterpart.

Experts believe the speed and efficiency gains will be enormous. For one, the chips no longer have to transfer data between the central processing unit (CPU) and storage blocks, a step that wastes both time and energy. For another, like biological neural networks, neuromorphic devices can support neurons that run millions of streams of computation in parallel.

A “Brain-on-a-chip”

Optimism aside, reproducing the biological synapse in hardware hasn’t been as easy as anticipated.

Neuromorphic chips exist in many forms, but they often look like a nanoscale metal sandwich. The “bread” pieces are generally conductive plates surrounding a switching medium—a conductive material that acts like the gap in a biological synapse. When a voltage is applied, as in the case of data input, ions move within the switching medium, creating conductive filaments that stimulate the downstream plate. This change in conductivity mimics the way biological neurons change their “weight,” or the strength of the connection between two adjacent neurons.

But so far, neuromorphic synapses have been rather unpredictable. According to Kim, that’s because the switching medium is often made of material that can’t channel ions to exact locations on the downstream plate.
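To make the “weight” analogy concrete, below is a minimal Python sketch (not code from either study) of an artificial synapse whose weight is a stored conductance. Each write pushes the conductance toward a target value plus a random error, a stand-in for ions wandering through an imperfect switching medium; the noise levels are purely illustrative.

```python
import random


class ArtificialSynapse:
    """Toy two-terminal artificial synapse.

    The "weight" is a conductance between 0 and 1. Each write is a
    voltage pulse that sets the conductance to a target value plus a
    random error, standing in for ions wandering through an imperfect
    switching medium. All numbers are illustrative, not measurements
    from either study.
    """

    def __init__(self, write_noise=0.20):
        self.conductance = 0.5          # stored synaptic weight
        self.write_noise = write_noise  # fractional spread per write

    def write(self, target):
        """Program the synapse toward a target conductance."""
        error = random.gauss(0.0, self.write_noise) * target
        self.conductance = min(1.0, max(0.0, target + error))
        return self.conductance

    def read(self, input_current):
        """Output current scales with the input and the stored weight."""
        return input_current * self.conductance


# Writing the same value over and over shows the problem: with a 20
# percent spread, the stored weight never lands in the same place twice.
noisy = ArtificialSynapse(write_noise=0.20)
print([round(noisy.write(0.8), 3) for _ in range(5)])

# A more uniform device (roughly a 4 percent spread, in the spirit of
# the silicon-based synapse described below) stays close to its target.
uniform = ArtificialSynapse(write_noise=0.04)
print([round(uniform.write(0.8), 3) for _ in range(5)])
```

The noisy device scatters around its target while the uniform one stays put, which is exactly the repeatability problem the MIT team set out to fix.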
“Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way,” explains Kim. “But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects.”

In his new study, Kim and colleagues swapped the jelly-like switching medium for silicon, a material with only a single line of defects that acts like a channel to guide ions. The chip starts with a thin wafer of silicon etched with a honeycomb-like pattern. On top is a layer of silicon germanium—a material often found in transistors—in the same pattern. This creates a funnel-like dislocation, a kind of Grand Canal that shuttles ions cleanly across the artificial synapse.

The researchers then made a neuromorphic chip containing these synapses and shot an electrical zap through them. Incredibly, the synapses’ responses varied by only four percent—far less variation than in any neuromorphic device made with an amorphous switching medium.

In a computer simulation, the team built a multi-layer artificial neural network using parameters measured from their device. After tens of thousands of training examples, their neural network correctly recognized samples 95 percent of the time, just two percentage points lower than state-of-the-art software algorithms.

The upside? The neuromorphic chip requires far less space than the hardware that runs deep learning algorithms. Forget supercomputers—these chips could one day run complex computations right on our handheld devices.

A Magnetic Boost

Meanwhile, in Boulder, Colorado, Dr. Michael Schneider at the National Institute of Standards and Technology also realized that the standard switching medium had to go.

“There must be a better way to do this, because nature has figured out a better way to do this,” he says.

His solution? Nanoclusters of magnetic manganese.

Schneider’s chip contained two slices of superconducting electrodes made out of niobium, which channel electricity with no resistance. When the researchers applied different magnetic fields to the synapse, they could control the alignment of the manganese “filling.”

The switch gave the chip a double boost. For one, aligning the switching medium made the current flow predictable and uniform. For another, the magnetic manganese itself adds computational power: the chip can now encode data in both the level of electrical input and the direction of the nanoclusters’ magnetism, without bulking up the synapse.

It seriously worked. Firing a billion times per second, the chips ran several orders of magnitude faster than human neurons. Plus, they required just one ten-thousandth of the energy used by their biological counterparts, all while synthesizing input from nine different sources in an analog manner.

The Road Ahead

These studies show that we may be nearing a benchmark where artificial synapses match—or even outperform—their human inspiration. But to Dr. Steven Furber, an expert in neuromorphic computing at the University of Manchester, we still have a ways to go before the chips hit the mainstream.

One hurdle is temperature: many of the special materials used in these chips only work under specific conditions, he says. The magnetic manganese chips, for example, operate at temperatures near absolute zero, meaning they come with giant cooling tanks filled with liquid helium—obviously not practical for everyday use.

Another is scalability. Millions of synapses are necessary before a neuromorphic device can be used to tackle everyday problems such as facial recognition. So far, no deal.
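To put that scalability gap in rough perspective, here is a back-of-the-envelope calculation in Python. The layer sizes are hypothetical, and the brain’s synapse count is a commonly cited order-of-magnitude estimate, not a figure from either study.

```python
# Rough arithmetic: how many artificial synapses would even a modest
# image-recognition network need? The layer sizes here are hypothetical.
layers = [64 * 64, 512, 256, 128, 10]  # e.g., 64x64 input image -> 10 classes

# Every connection between adjacent layers is one synapse (weight).
synapses = sum(a * b for a, b in zip(layers, layers[1:]))
print(f"{synapses:,} synapses for a small network")  # about 2.3 million

# The human brain is commonly estimated to hold on the order of
# 10**14 synapses, which is why millions are only a starting point.
print(f"the brain has roughly {10**14 // synapses:,}x more")
```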
But these problems may in fact be a driving force for the entire field. Intense competition could push teams to explore different ideas and solutions to similar problems, much like these two studies. If so, future chips may come in diverse flavors: much like our vast array of deep learning algorithms and operating systems, the computer chips of the future may vary depending on specific requirements and needs.

It is worth developing as many different technological approaches as possible, says Furber, especially as neuroscientists increasingly understand what makes our biological synapses—the ultimate inspiration—so amazingly efficient.

Image Credit: arakio / Shutterstock.com