Moore's Law
Moore's Law predicts that the number of transistors on a computer chip doubles roughly every two years, driving exponential improvements in computing power.
The prophecy that computing power doubles every two years—and why it's running out of steam.
Plausibility Index: 4.1/5 — Strong Foundation
Held remarkably true for 50+ years but is now hitting fundamental physical limits as transistors approach atomic scales.
The quick version
In 1965, Intel co-founder Gordon Moore noticed that computer chips were getting twice as powerful every couple of years, and boldly predicted this trend would continue. For over five decades, the tech industry has treated this observation as both prophecy and marching orders, reshaping everything from smartphones to space exploration.
Origin story
In April 1965, Gordon Moore was running research and development at Fairchild Semiconductor when the trade journal Electronics asked him to predict the future of computing. Moore, who would later co-found Intel, did something remarkable: he grabbed a ruler, plotted the number of components on integrated circuits from 1959 to 1965, drew a straight line on logarithmic paper, and extrapolated forward. The line suggested that component counts would double every year for at least the next decade.
Moore's prediction wasn't just an observation—it was a challenge. He essentially dared the semiconductor industry to keep up with his exponential curve. What started as a simple article became the tech world's most influential self-fulfilling prophecy. In 1975, Moore revised his timeline to doubling every two years, the version most people know today.
The timing was perfect. Moore's Law emerged just as Silicon Valley was finding its identity, providing a north star for an entire industry. It gave engineers a target, investors a roadmap, and consumers a promise: your computer will be twice as fast in two years, so maybe hold off on that purchase.
What Moore couldn't have predicted was how his simple observation would become the metronome of the digital age, driving everything from product development cycles to stock market valuations. Companies began planning their entire strategies around Moore's relentless timeline.
How it works
Moore's Law works through a combination of physics, economics, and psychology that creates a virtuous cycle of innovation. At its core, it's about shrinking transistors—the tiny switches that process information in computer chips. Every two years, engineers figure out how to make these switches roughly 30% smaller in each dimension, allowing them to pack twice as many onto the same chip area.
Think of it like a city growing denser. Instead of building outward, chip designers build 'up' and 'in,' cramming more transistors into the same space. In 1971, Intel's first microprocessor had 2,300 transistors. Today's largest chips contain over 100 billion—roughly as many transistors as there are stars in the Milky Way.
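The arithmetic in the last two paragraphs can be sketched in a few lines (a back-of-envelope illustration only; the function and its parameters are simplifications, not a model of any real process roadmap):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a strict two-year doubling,
    anchored to the Intel 4004's 2,300 transistors in 1971."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# A ~30% linear shrink per generation roughly halves the area per
# transistor, which is what doubles the density:
linear_scale = 0.7
area_scale = linear_scale ** 2        # ~0.49, i.e. about 2x density

print(f"{transistors(2021):.2e}")     # tens of billions of transistors
```

Fifty years of two-year doublings turns 2,300 transistors into tens of billions, consistent with the 100-billion-transistor chips mentioned above.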
The magic happens through a process called photolithography, which is essentially like printing newspapers but with light whose wavelength is thousands of times shorter than the width of a human hair. Engineers use that light to etch patterns onto silicon wafers with mind-boggling precision. The most advanced processes are marketed as '3-nanometer' nodes; the name is now more branding than a literal measurement, but the smallest structures on these chips are genuinely just tens of atoms across.
But Moore's Law isn't just about physics; it's about economics and expectations. The semiconductor industry has invested hundreds of billions of dollars in research and manufacturing, all betting that Moore's prediction will hold true. This creates enormous pressure to innovate, turning Moore's Law into a self-fulfilling prophecy that has driven five decades of exponential progress.
Real-world examples
The iPhone Revolution
When Steve Jobs unveiled the first iPhone in 2007, it contained roughly the same computing power as a desktop computer from the late 1990s—but in your pocket. Moore's Law made this possible by shrinking the processors that once filled entire rooms down to chips smaller than a postage stamp. Successive iPhone generations have packed in ever more transistors, enabling features like real-time photo processing, augmented reality, and AI-powered voice recognition that would have been science fiction just years earlier.
Netflix vs. Blockbuster
Netflix's victory over Blockbuster wasn't just about business model innovation—it was powered by Moore's Law. As internet speeds increased and storage became cheaper (both following Moore's Law-like exponential curves), streaming became viable. The computing power needed to compress, transmit, and decode high-definition video in real-time was unimaginable in Blockbuster's heyday but became routine thanks to exponentially improving processors.
The $1,000 Genome
The Human Genome Project, completed in 2003, took 13 years and roughly $3 billion to sequence a single human genome. Today, sequencing costs under $1,000 and takes a few hours. This dramatic improvement follows a Moore's Law-like exponential curve applied to DNA sequencing technology. Each generation of sequencing machines processes genetic information orders of magnitude faster, transforming medicine from one-size-fits-all treatments to personalized therapies based on your unique genetic code.
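Putting that cost decline into Moore's Law terms takes only a couple of lines (the 20-year span from 2003 is an assumption for illustration):

```python
import math

start_cost, end_cost = 3e9, 1e3   # ~$3 billion in 2003, under $1,000 today
years = 20                        # assumed elapsed time

halvings = math.log2(start_cost / end_cost)   # ~21.5 halvings of the cost
halving_time = years / halvings               # ~0.93 years per halving

print(f"{halvings:.1f} halvings, one every {halving_time:.2f} years")
```

A halving time of under a year means sequencing costs have fallen even faster than Moore's Law's two-year cadence.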
Criticisms and limitations
Moore's Law is running into the hard wall of physics, and that wall is the atom itself. As transistors shrink to widths of just a few atoms, quantum effects wreak havoc: electrons 'tunnel' through barriers they should never cross, causing chips to leak power and behave unpredictably. We're essentially trying to build machines out of components too small to obey classical physics.
The economics are also becoming prohibitive. Building a state-of-the-art chip factory now costs over $20 billion—more than the annual GDP of some countries. Each new generation of manufacturing equipment costs far more than the last, and only a handful of companies can afford to stay on the cutting edge. This has created a semiconductor oligopoly where just a few firms control the world's most advanced chip production.
Critics argue that Moore's Law has become a tyrannical master, forcing companies to prioritize raw speed over other improvements like energy efficiency, reliability, or security. This relentless focus on 'faster, smaller, more' has led to planned obsolescence, where perfectly functional devices become 'slow' simply because newer ones exist. It's also contributed to our throwaway culture of constantly upgrading electronics.
Perhaps most importantly, Moore's Law may have outlived its usefulness. Many computing tasks don't need exponentially more transistors—they need smarter software, better algorithms, or different approaches entirely. The future of computing may lie not in cramming more switches onto chips, but in quantum computing, biological computing, or entirely new paradigms we haven't invented yet.
Related theories
Metcalfe's Law
Describes how a network's value grows with the square of its number of users, complementing Moore's Law's hardware exponentials.
The Innovator's Dilemma
Explains why established chip companies struggle when Moore's Law shifts to new technologies.
S-Curve Theory
Suggests Moore's Law represents multiple overlapping S-curves of technological improvement rather than one continuous exponential.
Go deeper
Moore's Law: The Future of the Computer Industry by Chris Mack (2016) — Technical but accessible deep dive into the physics and economics behind the law.
The Innovators by Walter Isaacson (2014) — Chronicles Moore and the other pioneers who created the digital age.
Cramming More Components onto Integrated Circuits by Gordon Moore (1965) — The original Electronics Magazine article that started it all.
Footnotes
- Moore originally predicted doubling every year, then revised to every two years in 1975.
- The semiconductor industry spends over $70 billion annually on R&D to maintain Moore's Law.
- Some experts predict Moore's Law will end by 2025, while others believe new technologies will extend it indefinitely.