Moore's Law, the idea that the number of transistors on an integrated circuit doubles once every two years, has been an article of faith for many years now. As we begin to brush up against the limits of research and development into integrated circuit technology, though, the law is expected to run into trouble as a useful predictor.

Research, Development, And Moore’s Law

Moore’s famous prediction matters to R&D because, if Moore’s Law reaches its natural limit in the next couple of years, as many think it will, the very nature of computing R&D will change completely. That is what excites us at ForrestBrown: a push towards different ways of researching and developing computers will inevitably bring new and exciting R&D startups, hardware startups, and software research and development into how best to use these shiny new toys.

Moore’s Law is important more generally because transistor count is a good broad proxy for how powerful computers are, assuming factors such as clock speed and architecture keep pace, and because it sets the R&D goals of many chip manufacturers.

Larger smartphones and tablets, along with improvements in hardware efficiency, are already picking up some of the slack as it becomes harder and harder to fit more transistors onto a dense integrated circuit. These, together with more exotic areas of computing research and development, look set to improve the power of all our computers, but in very different ways.

More On Moore’s Law

The concise description “the number of transistors on an integrated circuit doubles once every two years” benefits from some degree of expansion.

Moore’s Law, as initially observed in 1965 by Gordon E. Moore, a co-founder of Intel Corporation, referred to a doubling of transistor count every year. Moore revised it to every two years in 1975, demonstrating clearly that it was an observation-based prediction subject to refinement and change rather than a strict law.
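
To make the arithmetic concrete, here is a minimal sketch in Python of what the two doubling rates predict. The baseline figure of 2,300 transistors (roughly the count of Intel’s 4004) is used purely for illustration.

```python
def projected_transistors(base_count, years, doubling_period=2):
    """Project transistor count under exponential doubling.

    base_count      -- transistor count at the baseline year
    years           -- years elapsed since the baseline
    doubling_period -- years per doubling: 1 for Moore's original
                       1965 observation, 2 for the 1975 revision
    """
    return base_count * 2 ** (years / doubling_period)

# Starting from ~2,300 transistors (roughly Intel's 4004), the two
# rates diverge dramatically over a single decade:
print(projected_transistors(2300, 10, doubling_period=1))  # ~2.4 million
print(projected_transistors(2300, 10, doubling_period=2))  # 73,600
```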

Although it was based on observation, it has worked well over the years as a predictor of transistor counts. This is at least partly because of market forces and competition within the tech industry, as keeping up with Moore’s Law became the benchmark against which hardware research and development was most closely measured. Both marketing and engineering departments started to dictate R&D processes to some extent, fully aware that their competitors were striving for much the same goals.

Why Is It Coming To An End?

[Image: The end of Moore’s Law – by Daniel Rehn]

In part, Moore’s Law must come to an end because it describes a physical phenomenon and is therefore governed by the physical limits of the universe. This is the logic behind the most generous estimates, which place its end as much as hundreds of years in the future.

More practically, in the near future the expense of further miniaturisation is growing, and research and development into alternative projects is becoming more appealing. Meanwhile, certain limitations of our current technology mean that, barring a stunning breakthrough, the costs of further development will start to soar once transistors shrink to around fourteen nanometres in length.

Few people, if any, are predicting that Moore’s Law will continue to apply beyond five-nanometre microchips, and a substantial minority of engineers are sceptical about its continued relevance past the ten-nanometre stage. If it starts to become uneconomical to invest in increasing transistor count, you can expect to see a lot more alternative R&D projects in the works.

What Will Replace It?

There are a couple of paradigms that could become more important to computing with Moore’s Law on the way out.

The first involves Nielsen’s Law, a prediction similar to Moore’s Law but applying to internet bandwidth rather than raw computing power: a high-end user’s connection speed grows by roughly 50% a year. In a world where bandwidth was the main factor in computing power, research and development would be focussed on cloud computing and on increasing the efficiency of large, dedicated server clusters serving bare-bones client computers.
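
As a rough illustration, here is a minimal sketch of the growth curve Nielsen’s Law describes; the starting bandwidth is a hypothetical figure chosen only for the example.

```python
def nielsen_bandwidth(base_mbps, years, annual_growth=0.5):
    """Project a high-end user's connection speed under Nielsen's
    Law, which observes roughly 50% growth per year."""
    return base_mbps * (1 + annual_growth) ** years

# A hypothetical 10 Mbps connection, projected ten years out:
print(round(nielsen_bandwidth(10, 10)))  # ~577 Mbps
```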

The second, in which performance per watt becomes the all-important factor, sees computers becoming ever more efficient and processes ever more parallelised, with a lot of work outsourced to a distributed internet of things. This paradigm could also result in an increased focus on cloud computing, with smaller client computers made more appealing to consumers by decreased power draw.
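
The metric itself is simple. Here is a minimal sketch comparing two entirely hypothetical chips: on raw speed the desktop part wins, but on the measure this paradigm cares about, the frugal mobile part comes out ahead.

```python
def performance_per_watt(ops_per_second, watts):
    """Efficiency metric: useful operations delivered per watt drawn."""
    return ops_per_second / watts

# Hypothetical figures for illustration only:
desktop = performance_per_watt(4e12, 95)  # ~42 billion ops per watt
mobile = performance_per_watt(5e11, 5)    # 100 billion ops per watt
print(mobile > desktop)  # True: the slower chip is the more efficient one
```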

Alternatively, here are some of the exotic technologies still being developed that could step into the gap left by Moore’s Law, in approximate order from closest to furthest away.

ASICs

Application-Specific Integrated Circuits, or ASICs, are circuits designed to do one single thing very well – as the name implies.

Their use is primarily associated with cryptocurrencies such as Bitcoin, but their success in this area – including secondary industries built entirely around renting ASICs – suggests wider personal adoption of different ASICs could be possible in the future.
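
To see how narrow that single task can be, consider Bitcoin mining: the entire job of a mining ASIC is to compute SHA-256 twice over an 80-byte block header, trillions of times per second. The sketch below shows that one operation in Python; a mining ASIC hard-wires exactly this and nothing else.

```python
import hashlib

def bitcoin_double_sha256(block_header: bytes) -> bytes:
    """The single operation a Bitcoin mining ASIC is built for:
    SHA-256 applied twice to a block header."""
    return hashlib.sha256(hashlib.sha256(block_header).digest()).digest()

# Miners vary a nonce inside the 80-byte header until the resulting
# hash falls below the network's difficulty target (placeholder header):
digest = bitcoin_double_sha256(b"\x00" * 80)
print(digest.hex())
```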

3D Chips

There are a couple of promising 3D chip architectures already in place, in the forms of through-silicon vias (TSV) and the ThruChip Interface (TCI), with 2.5D chip stacks also undergoing refinement and development.

The main usage so far looks to be in increasing the capacity of Solid State Drives, with Samsung’s Jaesoo Han saying: “we believe the 3-bit V-NAND will accelerate the transition of data storage devices from hard disk drives to SSDs”.

A move towards more powerful and thus, presumably, more cost-effective SSDs could see increasing numbers of servers adopting the drive technology. It could also spur research and development into software that makes good use of the caching capabilities of SSDs.
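
One common pattern such software could build on is read-through caching, with the SSD acting as a fast tier in front of a slower hard disk. Here is a minimal, hypothetical sketch of the idea, with an in-memory dict standing in for the SSD tier and a function standing in for the disk.

```python
class ReadThroughCache:
    """Sketch of SSD-style read-through caching: serve hot reads
    from a fast tier, fall back to slow storage on a miss, and
    promote the result for next time."""

    def __init__(self, slow_read):
        self.fast_tier = {}          # stands in for the SSD
        self.slow_read = slow_read   # stands in for the hard disk

    def read(self, key):
        if key in self.fast_tier:    # hit: fast path
            return self.fast_tier[key]
        value = self.slow_read(key)  # miss: slow path
        self.fast_tier[key] = value  # promote into the fast tier
        return value

cache = ReadThroughCache(slow_read=lambda key: f"data for {key}")
print(cache.read("block-42"))  # first read goes to the slow tier
print(cache.read("block-42"))  # repeat read is served from the cache
```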

Quantum Computing

[Image: D-Wave quantum computing chip – courtesy of D-Wave, Creative Commons Attribution 3.0]

Quantum computing is one of the best-known “next steps” for computer technology. It could offer a substantial improvement in computing power, but only for very specific processes able to take advantage of the strange ways quantum computers work. The most famous of these is Shor’s algorithm, which factors large numbers quickly enough to threaten much of today’s public-key encryption, while the most useful application looks like it will be the rapid simulation of protein folding – which currently requires vast amounts of classical computing power or crowdsourced problem-solving abilities.
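
The shape of Shor’s algorithm is easier to see in code. Only one step, finding the period of a^x mod n, is quantum; everything around it is ordinary classical arithmetic. The sketch below substitutes a slow classical loop for that quantum step, which is precisely the part a quantum computer would do exponentially faster.

```python
from math import gcd

def find_period(a, n):
    """Find the period r of a^x mod n classically -- the one step
    Shor's algorithm hands to a quantum computer."""
    x, value = 1, a % n
    while value != 1:
        x += 1
        value = (value * a) % n
    return x

def shor_factor(n, a=2):
    """Classical skeleton of Shor's algorithm for an odd composite n.
    Unlucky choices of a return None; real runs simply retry."""
    if gcd(a, n) != 1:
        return gcd(a, n)             # a already shares a factor with n
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                  # unlucky a: pick another and retry
    return gcd(pow(a, r // 2) - 1, n)

print(shor_factor(15))  # 3, since 2^4 = 16 = 1 (mod 15)
```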

A quantum computer seems as though it would be most impactful as a vehicle for further research and development, enabling mathematicians, cryptographers and scientists to make significant technological breakthroughs. It is not so much a speeding up of computers as a completely different way of computing.

The most famous quantum computers exist courtesy of D-Wave (example chip pictured above), a company that overcame early scepticism to form the basis of Google’s drive towards quantum computing.

Biocomputing

Biocomputing is probably the most exotic direction computing could head in.

From primitive approximations of telepathic rats to computers based on chemical feedback loops, the field is young and has not had time to form a particular overall direction.

The most promising route so far is DNA computing, which has two strands of research, so to speak.

The first is DNA as memory, which has shown good early results and could one day prove practical as low- or no-energy long-term storage.

The second is DNA as an actual computing device. In particular, research is focussing on the ability of DNA computers to solve variations on the Travelling Salesman Problem – a good problem to focus on because it is complex enough to play to the strengths of DNA computers, which seem to take longer than classical computers to solve simple problems.

The application of DNA computers to the Travelling Salesman Problem is particularly interesting because one of the areas where the problem is most relevant is efficient computer chip design.
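
To see why the problem suits DNA computing, it helps to look at the classical baseline. A brute-force search like the hypothetical sketch below must check tours one at a time, and the number of tours grows factorially; a DNA computer instead generates every candidate route at once as strands in a test tube.

```python
from itertools import permutations

def brute_force_tsp(distances):
    """Exhaustive Travelling Salesman search: try every tour and keep
    the cheapest. distances[i][j] is the cost of travelling from
    city i to city j."""
    n = len(distances)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):  # fix city 0 as the start
        tour = (0,) + perm
        cost = sum(distances[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# Four cities mean only (4 - 1)! = 6 tours; at twenty cities there are
# already about 1.2e17, which is where exhaustive search collapses.
distances = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(brute_force_tsp(distances))  # ((0, 1, 3, 2), 18)
```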

Moving Beyond Moore’s Law

As we move beyond Moore’s Law and into the realm of exotic, exciting, specialised computing devices, the importance of research and development will only increase.

Similarly, as consumer software begins to be bounded by practical hardware limitations, there are even more opportunities for research and development into efficiency and algorithmic improvements.

We’re interested in the deceleration and possible death of Moore’s Law at ForrestBrown because R&D tax credits are our main focus, and a diversification of commercial research beyond decreasing transistor size could be good for business.

More than that, though, we’re just massively excited by the new technologies that could rise from its ashes.
