Tue, Feb 07, 2017 @ 08:00 AM
For most of the relatively brief history of modern computing, progress has been measured in nanometers. By making transistors ever smaller, engineers have been able to pack more of them onto each chip. More transistors per chip means faster, more powerful computers that fit into ever-smaller devices. These microprocessors made possible the rise of modern consumer electronics, including the PC you’re reading this blog on and the smartphone in your pocket.
More than 40 years ago, Gordon Moore, a co-founder of chip-maker Intel, observed that the number of transistors on a microchip had been doubling roughly every year, and predicted the trend would continue. His observation became known as “Moore’s Law,” and its continued accuracy has depended on the industry’s ability to keep making smaller, thinner transistors. Now, however, experts generally agree that Moore’s Law is coming to an end.
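To get a feel for what that doubling implies, here is a quick back-of-the-envelope sketch (not from the article; the function name is mine, and the doubling period uses the commonly cited revised figure of about two years):

```python
def transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward, assuming it doubles
    every `doubling_period` years (Moore's Law as exponential growth)."""
    return start_count * 2 ** (years // doubling_period)

# Starting from 2,300 transistors (the count on Intel's 1971 4004 chip),
# 20 years of doubling every two years gives a roughly 1,000x increase:
print(transistors(2300, 20))  # 2300 * 2**10 = 2,355,200
```

The striking part is the compounding: ten doublings multiply the count by about a thousand, which is why a few decades of Moore's Law turned thousands of transistors into billions.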