Monday, 15 February 2016

The legendary computing rule is dying, thanks to smartphones (Tech2 Mobile)



Moore’s law is the observation that, “over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years.” It’s a law that’s held true for over 50 years, but physical limitations and the smartphone age may have finally killed it.
While not really a law as such (gravity is a law), Moore’s law has held true these past 51 years because steady improvements in manufacturing techniques have allowed transistors to shrink year after year. However, we’re now reaching a physical limit as far as transistors are concerned. Chipmakers like Intel have struggled to bring 14nm transistors to market and have already delayed the introduction of 10nm chips by a year or more.
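To get a feel for what “doubling every two years” means, here is a rough back-of-the-envelope sketch in Python. The starting figure of about 2,300 transistors for Intel’s 4004 (1971) is a well-known datum; the projection itself is just the compounding rule from the observation above, not a claim about any specific chip.

```python
# Moore's law as stated: transistor count doubles roughly every two years.
# Illustrative starting point: Intel's 4004 (1971), about 2,300 transistors.

def projected_transistors(start_count, start_year, target_year, doubling_period=2.0):
    """Project a transistor count forward under the doubling rule."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

count_2016 = projected_transistors(2_300, 1971, 2016)
print(f"Projected for 2016: {count_2016:,.0f}")  # on the order of 10 billion
```

The projection lands in the low tens of billions, which is roughly where the largest chips of 2016 actually sat, showing just how well the observation held for 45 years.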

Too hot to handle
The 14nm transistor size means that transistor widths can already be measured in numbers of atoms, and shrinking them further requires a monumental effort of engineering and precision. It’s an effort that will not necessarily yield worthwhile returns. The smaller transistors get, the more densely they are packed and the more heat is generated per unit area. This is a problem because heat gives electrons extra energy, and energetic electrons can “jump the gap”.
What’s the “gap”? Oversimplifying the process considerably, a CPU works by channeling electrons (electricity) through logic gates (AND, OR, XOR, etc.); these gates are made up of transistors. If the transistor is “closed”, the electrons won’t pass through. When a small voltage is applied to the transistor’s gate, it “opens” and the electrons may pass.
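The idea of building logic gates out of transistor switches can be sketched in code. This is a toy model, not real circuit design: it treats each transistor as a switch that conducts only while its gate input is high, then wires two of them in series for AND and in parallel for OR, the same way simple CMOS-style logic is usually explained.

```python
# Toy model: a transistor passes its source signal only when its gate is high.

def transistor(gate, source):
    """Conducts the source signal (0 or 1) only while the gate input is 1."""
    return source if gate else 0

def and_gate(a, b):
    # Two transistors in series: the signal must pass through both.
    return transistor(b, transistor(a, 1))

def or_gate(a, b):
    # Two transistors in parallel: either one can pass the signal.
    return transistor(a, 1) | transistor(b, 1)

def xor_gate(a, b):
    # XOR composed from the gates above plus negation.
    return or_gate(a, b) & (1 - and_gate(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_gate(a, b), or_gate(a, b), xor_gate(a, b))
```

In a real chip the “heat problem” above corresponds to a closed `transistor` leaking current anyway; in this model that would mean `transistor(0, 1)` sometimes returning 1, which is exactly the kind of error the article describes.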
Heat, on the other hand, can endow electrons with enough energy to jump the proverbial gate, resulting in errors. When you have transistors that are less than a hundred atoms wide and these transistors are packed together by the billions, this is a real possibility.
Another problem with shrinking transistor sizes is that the weird (un)realities of quantum mechanics rear their ugly heads: at these scales, electrons can tunnel straight through barriers that should stop them. Einstein himself labelled quantum strangeness “spooky action at a distance” and we’re not going to dive into that scientific black hole. Suffice it to say, it’s a problem that’s not easy to understand, let alone solve.
Smarter, not smaller
As transistors shrank, so did PCs. The smartphone is now the de facto computing device and, by its very nature, prioritises efficiency, low power consumption and better heat management above all else. Speed and responsiveness are essential for a great experience, and a real challenge considering the aforementioned restrictions. The only way around that is multi-core mobile platforms, but these are very different from traditional desktop designs.
Smartphone makers have such tight control over their OS and hardware that they can afford to design computing platforms (SoCs like Snapdragon, for example) with specific hardware modules for each and every task. A motion co-processor to keep track of movement, a dedicated H.264 decoder for video, a dedicated CPU core for the OS, and so on are now found on just about every single phone.
These specific modules take the load off any single CPU core, allowing for devices that are responsive, efficient, and sip battery (relatively speaking). Such platforms are not necessarily constrained by Moore’s law as much as traditional PCs.
Speaking of traditional PCs, Moore’s law hasn’t really been applicable since at least 2011. The hardcore aficionados among you might still remember the revelation that was Intel’s Sandy Bridge platform. Four generations of CPUs since have failed to provide a compelling reason to upgrade. They were faster, yes, but by such small margins that real-world performance remained largely unaffected.
Even with PCs today, efficiency is king and laptops rule the roost.
Moore’s law is dead, for now
Gordon Moore’s law was always more a cultural phenomenon than a physical law. It had a good run, but after 51 years, physical limitations and changes in our computing habits have finally sounded its death knell.
