While Moore's Law is rapidly coming to an end, the linked articles indicate that there are still paths forward for continued growth to realize the coming 4th Industrial Revolution, with regard to both the Internet of Things (IoT) and AI, until quantum computing gets up to speed:
http://www.techrepublic.com/article/moores-law-dead-in-2021-heres-what-the-next-revolution-will-mean/
Extract: “The news that transistors will essentially stop shrinking after 2021 is a big deal. Because Moore's Law only accounts for the number of transistors in a circuit, it could very well be meeting its demise as the de facto measure of growth in the computing industry. This has led many to ask where the industry will go after Moore's Law.
Hanselman said he believes there's a bigger question to ask: "What are we doing with this kind of computing power?"
Fitting more transistors on a wafer is a useful measure, he said, but we need to better understand how we are designing the systems that are leveraging those transistors. After 2021, we may no longer be able to increase the number of transistors on a particular die, but the cost will continue to drop, which may be an even bigger catalyst.
"As we look toward technologies like the Internet of Things and various means of dispersed computing, we now start to make it very inexpensive to put an awful lot of processing horsepower into all sorts of things that, today, are cost prohibitive," Hanselman said. "And that, I think, is probably the larger revolution that continues, even though we may taper off the advances that Moore's Law has afforded us for so long."
http://www.extremetech.com/extreme/232342-moores-law-scaling-dead-by-2021-to-be-replaced-by-3d-integration
Extract: “Moore’s law scaling dead by 2021, to be replaced by 3D integration
...
… while the ITRS’ executive summary makes extensive predictions regarding future device frequencies, bandwidths, and operating characteristics at the data center, mobile, and Internet of Everything (the proposed successor to the Internet of Things), it does not attempt to predict the future of conventional desktops and laptops. The closest it comes is predicting that by 2029 the average mobile processor will contain 25 application processors and 303 GPU cores, with a max single-component frequency of 4.7GHz (presumably burst frequency).
The implications of the report are clear: Those who seek significantly improved CPU performance will do best to seek it via new computing architectures, improved multi-threading, or improved memory performance in general — not via improvements to raw clock speed. With Intel stuck in the doldrums when it comes to providing architectural improvements, we wouldn’t hold our breath on this front.”
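Since the extract points to multi-threading and new architectures, rather than raw clock speed, as the remaining avenues for performance, here is a minimal sketch (my own illustration, not from any of the linked articles) of spreading a CPU-bound computation across cores using Python's standard library. The function names and chunking scheme are hypothetical choices for the demo:

```python
from concurrent.futures import ProcessPoolExecutor


def sum_squares(bounds):
    """CPU-bound work over a half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))


def parallel_sum_squares(n, workers=4):
    """Split [0, n) into one chunk per worker and sum the partial results."""
    step = (n + workers - 1) // workers
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    # Processes (not threads) are used because CPython's GIL prevents
    # threads from speeding up pure-Python CPU-bound code.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_squares, chunks))


if __name__ == "__main__":
    n = 1_000_000
    assert parallel_sum_squares(n) == sum(i * i for i in range(n))
```

The point of the sketch is that once clock speeds stop rising, throughput gains like this come from restructuring the work across more cores, which is exactly the software-side shift the article anticipates.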
https://www.technologyreview.com/s/601441/moores-law-is-dead-now-what/
Extract: “But in a few years technology companies may have to work harder to bring us advanced new use cases for computers. The continual cramming of more silicon transistors onto chips, known as Moore’s Law, has been the feedstock of exuberant innovation in computing. Now it looks to be slowing to a halt.
“We have to ask, is this going to be a problem for areas like mobile devices, data centers, and self-driving cars?” says Thomas Wenisch, an assistant professor at the University of Michigan. “I think yes, but on different timescales.”
…
… Intel pushed back its next transistor technology, with features as small as 10 nanometers, from 2016 to late 2017. The company has also decided to increase the time between future generations (see “Intel Puts the Brakes on Moore’s Law”). And a technology roadmap for Moore’s Law maintained by an industry group, including the world’s largest chip makers, is being scrapped. Intel has suggested silicon transistors can only keep shrinking for another five years.
…
“For the last three years we’ve seen a kind of stagnation,” says Simon. That’s bad news for research programs reliant on supercomputers, such as efforts to understand climate change, ...
Simon says the coming plateau in transistor density will stir more interest in redrawing the basic architecture of computers among supercomputer and data-center designers. Getting rid of certain design features dating from the 1940s could unlock huge efficiency gains (see “Machine Dreams”). Yet taking advantage of those would require rethinking the design of many types of software, and would require programmers to change their habits.
Whatever kind of computer you’re interested in, the key question is whether the creative avenues left open to computing companies can provide similar payoffs to Moore’s Law after it ends, says Neil Thompson, an assistant professor at MIT Sloan School. “We know that those other things matter, but the question is, are they of the same scale?” he says.”
See also:
https://www.technologyreview.com/s/601102/intel-puts-the-brakes-on-moores-law/
https://www.technologyreview.com/s/601195/a-2-billion-chip-to-accelerate-artificial-intelligence/
Extract: “At a company event in San Jose, he said, “For the first time we designed a [graphics-processing] architecture dedicated to accelerating AI and to accelerating deep learning.” Nvidia spent more than $2 billion on R&D to produce the new chip, said Huang. It has a total of 15 billion transistors, roughly three times as many as Nvidia’s previous chips. Huang said an artificial neural network powered by the new chip could learn from incoming data 12 times as fast as was possible using Nvidia's previous best chip.
Deep-learning researchers from Facebook, Microsoft, and other companies that Nvidia granted early access to the new chip said they expect it to accelerate their progress by allowing them to work with larger collections of neurons.
“I think we’re going to be able to go quite a bit larger than we have been able to in the past, like 30 times bigger,” said Bryan Catanzaro, who works on deep learning at the Chinese search company Baidu. Increasing the size of neural networks has previously enabled major jumps in the smartness of software. For example, last year Microsoft managed to make software that beats humans at recognizing objects in photos by creating a much larger neural network.
Huang of Nvidia said that the new chip is already in production and that he expects cloud-computing companies to start using it this year. IBM, Dell, and HP are expected to sell it inside servers starting next year.”
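As a rough illustration of what "30 times bigger" means in practice (my own sketch, not from the article): in a plain fully connected network, parameter count grows with the product of adjacent layer widths, so widening a hidden layer by ~30x multiplies its weight count by roughly the same factor. The layer sizes below are hypothetical:

```python
def dense_params(layer_sizes):
    """Weights plus biases for a plain fully connected (dense) network."""
    return sum(m * n + n for m, n in zip(layer_sizes, layer_sizes[1:]))


# Hypothetical sizes: a small classifier vs. one with a ~30x wider
# hidden layer; the wider model carries roughly 30x the parameters,
# hence the appetite for faster chips to train it.
small = dense_params([784, 256, 10])
wide = dense_params([784, 7680, 10])
```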
See also:
https://www.technologyreview.com/s/540851/microsoft-says-programmable-chips-will-make-ai-software-smarter/
https://www.technologyreview.com/s/536786/machine-dreams/
Regarding Quantum Computing, see:
http://www.siliconbeat.com/2016/05/04/moores-law-doubt-eyes-turn-quantum-computing/
Extract: “Industry experts from around the world who have been working together for years to forecast technology advances in the tech industry are throwing in the towel. The next version of the International Technology Roadmap for Semiconductors, which is produced jointly by the semiconductor industry associations of the United States, Europe, Japan, South Korea and Taiwan, will be the last, the New York Times reported.
The reason: the industry can no longer count on silicon chip technology advancing like clockwork as Intel co-founder Gordon Moore predicted back in 1965.
“The end of Moore’s Law is what led to this,” Thomas M. Conte, a computer scientist at the Georgia Institute of Technology, told the Times.
Conte and other experts are working on a replacement for the forecasts that will include other types of technologies, including so-called quantum computers.
…
Some experts think there are ways forward — just not with silicon-based chips.
Chip experts are already exploring — and in some cases already producing — chips made of materials other than silicon. One material considered promising is graphene, which is a form of carbon.
Another possibility is to ditch the cut-and-dried, black-and-white, binary computer architecture for the weird world of quantum mechanics. Some computer scientists are already pioneering computers that are built around quantum bits, or “qubits.””
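To make the "qubit" idea in the extract concrete, here is a minimal sketch (assuming an idealized, noise-free single qubit; all names are illustrative) of a qubit as a pair of complex amplitudes, in contrast to a classical bit's two discrete values. The Hadamard gate turns a definite 0 into an equal superposition:

```python
import math


def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (alpha, beta)."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))


def probabilities(state):
    """Born rule: the chance of measuring 0 or 1 from the amplitudes."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)


zero = (1.0, 0.0)            # analogue of a classical bit set to 0
superposed = hadamard(zero)  # equal superposition: 50/50 on measurement
```

Simulating one qubit this way is trivial, but the state vector doubles with every added qubit, which is precisely why quantum hardware, rather than classical simulation, is the interesting path.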