Moore’s Law Lives: Intel Says Chips ...

If you weren’t already convinced the digital world is taking over, you probably are now. To keep the economy on life support as people stay home to stem the viral tide, we’ve been forced to digitize interactions at scale (for better and worse). Work, school, events, shopping, food, politics.

The companies at the center of the digital universe are now powerhouses of the modern era, worth trillions and nearly impossible to avoid in daily life. Six decades ago, this world didn’t exist. A humble microchip in the early 1960s would have boasted a handful of transistors. Now, your laptop or smartphone runs on a chip with billions of transistors. As first described by Moore’s Law, this is possible because the number of transistors on a chip doubled with extreme predictability every two years for decades.

But now progress is faltering as the size of transistors approaches physical limits, and the money and time it takes to squeeze a few more onto a chip are growing. There’ve been many predictions that Moore’s Law is, finally, ending. But, perhaps also predictably, the company whose founder coined Moore’s Law begs to differ.

In a keynote presentation at this year’s Hot Chips conference, Intel’s chief architect, Raja Koduri, laid out a roadmap to increase transistor density (that is, the number of transistors you can fit on a chip) by a factor of 50.

“We firmly believe there is a lot more transistor density to come,” Koduri said. “The vision will play out over time, maybe a decade or more, but it will play out.”

Why the optimism? Calling the end of Moore’s Law is a bit of a tradition. As Peter Lee, vice president at Microsoft Research, quipped to The Economist a few years ago, “The number of people predicting the death of Moore’s Law doubles every two years.” To date, prophets of doom have been premature, and though the pace is slowing, the industry continues to dodge death with creative engineering.

Koduri believes the trend will continue this decade and outlined the upcoming chip innovations Intel thinks can drive more gains in computing power.

Keeping It Traditional

First, engineers can further shrink today’s transistors. Fin field effect transistors (or FinFET) first hit the scene in the 2010s and have since pushed chip features past 14 and 10 nanometers (or nodes, as such size checkpoints are called). Koduri said FinFET will again triple chip density before it’s exhausted.

The Next Generation

FinFET will hand the torch off to nanowire transistors (also known as gate-all-around transistors). Here’s how they’ll work.

A transistor is made up of three basic components: the source, where current is introduced; the gate and channel, where current selectively flows; and the drain, where current exits. The gate is like a light switch. It controls how much current flows through the channel. A transistor is “on” when the gate allows current to flow, and it’s off when no current flows. The smaller transistors get, the harder it is to control that current.

FinFET maintained fine control of current by surrounding the channel with a gate on three sides. Nanowire designs kick that up a notch by surrounding the channel with a gate on all four sides (hence, gate-all-around). They’ve been in the works for years and are expected around 2025. Koduri said first-generation nanowire transistors will be followed by stacked nanowire transistors, and together they’ll quadruple transistor density.

Building Up

Growing transistor density won’t only be about shrinking transistors, but also going 3D. This is akin to how skyscrapers increase a city’s population density by adding more usable space on the same patch of land.

Along those lines, Intel recently launched its Foveros chip design. Instead of laying a chip’s various “neighborhoods” next to each other in a 2D silicon sprawl, they’ve stacked them on top of each other like a layer cake. Chip stacking isn’t entirely new, but it’s advancing and being applied to general purpose CPUs, like the chips in your phone and laptop. Koduri said 3D chip stacking will quadruple transistor density.
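Taken together, those multipliers roughly account for the 50x density target cited above. Here is a back-of-the-envelope sketch (a minimal illustration using the figures as reported, not anything from Intel’s roadmap itself) that compounds the gains under the assumption that they stack independently, and sanity-checks the classic doubling cadence:

```python
# Back-of-the-envelope check of the density roadmap described above.
# Assumption for illustration: the reported gains compound independently.

finfet_gain = 3       # further-shrunk FinFET: roughly triples density
nanowire_gain = 4     # nanowire + stacked nanowire transistors: ~4x
stacking_3d_gain = 4  # 3D chip stacking (Foveros-style): ~4x

total_gain = finfet_gain * nanowire_gain * stacking_3d_gain
print(f"Compounded density gain: ~{total_gain}x")  # ~48x, close to the 50x cited

# Classic Moore's Law sanity check: doubling every 2 years for ~60 years
# takes a handful of transistors into the billions.
doublings = 60 // 2
print(f"2**{doublings} = {2 ** doublings:,}")  # 1,073,741,824
```

The point of the sketch is simply that a handful of individually modest steps multiply into the headline figure.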
A Self-Fulfilling Prophecy

The technologies Koduri outlined are an evolution of the same general technology in use today. That is, we don’t need quantum computing or nanotube transistors to augment or replace silicon chips yet. Rather, as it’s done many times over the years, the chip industry will get creative with the design of its core product to realize gains for another decade.

Last year, veteran chip engineer Jim Keller, who at the time was Intel’s head of silicon engineering but has since left the company, told MIT Technology Review there are over 100 variables driving Moore’s Law (including 3D architectures and new transistor designs). From the standpoint of pure performance, it’s also about how efficiently software uses all those transistors. Keller suggested that with some clever software tweaks “we could get chips that are a hundred times faster in 10 years.”

But whether Intel’s vision pans out as planned is far from certain. Intel has faced challenges recently, taking five years instead of two to move its chips from 14 nanometers to 10 nanometers. After a delay of six months for its 7-nanometer chips, it’s now a year behind schedule and lagging other makers who already offer 7-nanometer chips.

This is a key point. Yes, chipmakers continue making progress, but it’s getting harder, more expensive, and timelines are stretching. The question isn’t whether Intel and its competitors can cram more transistors onto a chip (Intel rival TSMC agrees it’s clearly possible); it’s how long it will take and at what cost.

That said, demand for more computing power isn’t going anywhere. Amazon, Microsoft, Alphabet, Apple, and Facebook now make up a whopping 20 percent of the stock market’s total value. By that metric, tech is the most dominant industry in at least 70 years. And new technologies, from artificial intelligence and virtual reality to a proliferation of Internet of Things devices and self-driving cars, will demand better chips.

There’s ample motivation to push computing to its bitter limits and beyond. As is often said, Moore’s Law is a self-fulfilling prophecy, and whatever comes after it will likely be too.

Image credit: Laura Ockel / Unsplash