We are approaching a point beyond which Moore’s Law will no longer apply, at least economically.
Moore’s Law predicted steady exponential growth in processing power, so that we could do more at the same cost as time goes by – but that is no longer true.
As Moore’s Law approaches its practical limits, the future of computing lies not in cramming more transistors onto a chip but in getting processor architectures to work smarter, not harder. Survival will require a focus on breakthroughs that can disrupt markets and create new opportunities.
Computing evolution had stalled – until now.
Computing has advanced steadily since humanity harnessed electrical power and transformed it into something approaching an intelligent control system. This process started with vacuum tubes, then advanced to transistors, and finally to microchips. The advances conformed to Moore’s Law, which predicted the doubling of chip performance roughly every 18 months. From an evolutionary standpoint, however, computing technology remains primitive: today’s torrents of data mean little to the systems that process them. That is starting to change as we enter an unprecedented stage of cognitive computing. Even if advances in computing unfold at less-than-exponential rates from now on, we still anticipate a generational change in its evolution.
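As a rough, back-of-the-envelope illustration of that doubling rule (taking the 18-month interval cited above at face value, with an arbitrary starting performance \(P_0\)), the compounding works out as:

\[
P(t) = P_0 \cdot 2^{\,t/1.5}, \qquad P(6\ \text{years}) = P_0 \cdot 2^{4} = 16\,P_0
\]

In other words, six years of uninterrupted doubling would deliver roughly sixteen times the performance at a comparable cost – the kind of compounding the industry can no longer count on.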
Moore’s Law, no more
We are approaching a point beyond which Moore’s Law will no longer apply, at least economically. Demand for computing power and communications bandwidth is expanding each year, yet the industry is now bumping up against 1) the physical limits of shrinking semiconductors, which threaten the integrity of chips; 2) rising production costs that will render further advances economically unviable; and 3) weakening demand as those costs increase.
Maths, biology and physics converge
The future of computing will look fundamentally different as mathematics, biology and physics converge. Scaling silicon is no longer the only way to achieve better performance, and the computing landscape is becoming less horizontal and more vertical, with processing spread across a wider range of specialised chips. Efficiency gains are playing a role in meeting near-term computing demand, but we expect radical – not incremental – breakthroughs in computing to solve increasingly complex sets of problems. That future is a convergence of maths with information (the classical 0 and 1 bits), biology with information (neurons in a neural network), and physics with information (quantum qubits).
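To give those three strands a concrete, minimal form (a sketch in standard textbook notation, not a description of any specific product): a classical bit takes exactly one of two values, a qubit occupies a weighted superposition of both, and an artificial neuron combines weighted inputs through a non-linearity:

\[
b \in \{0,1\}, \qquad |\psi\rangle = \alpha|0\rangle + \beta|1\rangle \ \ \big(|\alpha|^2 + |\beta|^2 = 1\big), \qquad y = \sigma\Big(\textstyle\sum_i w_i x_i + \theta\Big)
\]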
Disruptive potential unleashed
The opportunities – and threats – are substantial. The end of Moore's Law should not be seen as the end of progress. We are in the midst of a real-world paradigm shift: the final stages of a decades-long transition from the scientific discipline of computing to an array of applied cognitive technologies made more widely available through innovative architectures. These advances have yet to realise their full disruptive potential, but their benefits will lead to greater adoption, and investing in them will be crucial for many firms’ long-term prospects.
For more on what is possible in the future of cognitive computing, speak to your Morgan Stanley financial adviser or representative.