Will personal computers continue to get faster, or will there be a plateau in the near future? (With a focus on hardware)

Since 1965, computer scientists have been able to forecast the growth in the density of transistors on integrated circuits. This is thanks to an observation that Gordon Moore, co-founder of Intel and, before that, of Fairchild Semiconductor, made about historical trends in transistor density in his 1965 article, ‘Cramming More Components onto Integrated Circuits’. He observed that the number of transistors on an integrated circuit doubles roughly every two years (in the law’s later, revised form). This was particularly useful for Intel, the biggest CPU vendor, because it allowed the company to set research and development targets for the number of transistors on its integrated circuits. According to the International Technology Roadmap for Semiconductors, however, it looks increasingly unlikely that computers will continue to obey Moore’s Law beyond 2021, and increasingly likely that transistor density will hit a plateau, an opinion supported by Gordon Moore himself, as well as by the current CEO of Intel, Brian Krzanich. Because transistor density is thought to be one of the most important factors in determining the clock speed (speed of operation) of a computer, it is likely that advances in computer clock speed will slow or even stop in the near future. This is, however, only true if computer scientists are unable to successfully explore and put into practice different methods of creating hardware, or even different types of computer, such as quantum computers.
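The compounding effect of this doubling can be illustrated with a short calculation. The function below is only a sketch, assuming a strict two-year doubling that real fabrication schedules merely approximate; the Intel 4004 transistor count used in the example is the commonly cited figure.

```python
# Sketch: projected transistor count under Moore's Law, assuming a strict
# doubling every `doubling_period` years (real cadences vary).
def moores_law(initial_count: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor count `years` years into the future."""
    return int(initial_count * 2 ** (years / doubling_period))

# Intel's 4004 (1971) had about 2,300 transistors; 40 years of doubling
# every two years projects roughly 2.4 billion, close to real 2011 CPUs.
print(moores_law(2_300, 40))  # 2411724800
```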

With modern silicon transistors manufactured at feature sizes as small as 7nm, the future of Moore’s Law is uncertain: below roughly this size, transistors stop functioning correctly, because the electrons inside them experience quantum tunnelling, meaning they can flow freely between logic gates and switch on, at random, transistors that are meant to stay off. The smallest transistors currently in existence are made from carbon nanotubes and molybdenum disulphide, the smallest of which are around 1nm long. Using these on integrated circuits would make the circuits perform faster, but this solution to the problem of size is not yet viable, because it is too expensive, and as yet there has been no indication that the process of creating these transistors will become cheaper. Furthermore, even transistors made from carbon nanotubes and molybdenum disulphide have a physical limit to how small they can get. On top of this, the uncertainty surrounding the question of whether smaller, more effective transistors can be made makes Intel reluctant to invest in research. In a series of comments he provided to CNET, based on the speech he gave at the Hot Chips conference, Gordon Moore said:

‘It takes huge amounts to build the fab plants and yet more to pay for the design teams to design new chips. Intel makes these investments, which are in the billions of dollars because they expect to reap way more billions of dollars in profits in the following years. But if there is doubt that those profits will arrive, and possibly if they just doubt they can come up with the necessary silicon improvements, they may not want to make an investment at all. Should a major player like Intel make such a call, that would effectively end Moore’s Law all by itself, because then the various companies that make the super expensive tools for chip production will themselves not make the investments needed to keep Moore’s Law alive.’

This section of his comments shows that, regardless of whether there are physical limits to how small transistors can get, there are also economic limitations restricting the further development of integrated circuits. This reluctance to invest is due not only to the limitations of transistors but also to a slowdown in growth, which Intel experienced in the first quarter of 2017. Moore’s Law has also been shown to be unreliable in the past. Originally, in 1965, Moore claimed that if historical trends continued, the density of transistors would double every year; this was revised to every 18 months in 1975 at the International Electron Devices Meeting, and later still to every two years. These facts suggest either that Moore slightly misinterpreted the historical trends, or that the annual percentage growth in transistor density is itself shrinking. Intel itself stated in 2015 that the pace of advancement in transistor research had slowed. Brian Krzanich, the CEO of Intel, announced in 2015 that:

‘Our cadence is closer to two and a half years than two.’

He also suggested that the 1975 revision of Moore’s Law set a precedent for this slowing of development, and that the next doubling in transistor density would come only after three years. The scepticism of Intel, a company that has in the past devoted itself almost exclusively to the development and production of integrated circuits, is a sign of how unlikely it is that we will see further advances in computer hardware in the near future.
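The practical effect of a slower cadence compounds quickly. As a rough sketch (assuming clean exponential doubling, which real products only approximate), the growth factor over a decade at the cadences mentioned above:

```python
# Growth factor over a given number of years at different doubling cadences.
def growth_factor(years: float, doubling_period: float) -> float:
    return 2 ** (years / doubling_period)

print(growth_factor(10, 2.0))  # 32.0 -- Moore's revised two-year cadence
print(growth_factor(10, 2.5))  # 16.0 -- Krzanich's 'closer to two and a half years'
print(round(growth_factor(10, 3.0), 1))  # 10.1 -- a three-year cadence
```

A half-year change in cadence halves the growth achieved over ten years, which is why the revision matters so much commercially.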

Although hardware seems to be a limiting factor in the advancement of the clock speed of computers, engineers have noted that there are unexplored potential alternatives to the silicon transistors we use today. Silicon is a group-four element on the periodic table; however, it is thought that elements in groups three and five could make better transistor materials. Of particular interest is phosphorus. A high-performance transistor made from black phosphorus was created by researchers led by Director Young Hee Lee at the Institute for Basic Science, Center for Integrated Nanostructure Physics at Sungkyunkwan University in South Korea. The transistors made from this black phosphorus were smaller than those made from silicon, as well as easy to make. A smaller integrated circuit built with these transistors was shown to work just as well as a silicon-based integrated circuit; however, the abundance of silicon, and its consequent cheapness, makes black phosphorus, a comparatively expensive material, an unviable competitor.

Other future paths for computers include quantum computers. Quantum computers will operate on different hardware as well as different software, and promise the ability to perform certain calculations faster than classical computers, including calculations more complex than even the best computers of today can perform. Quantum computing chips can be made from silicon, as well as from a variety of other elements and compounds. What makes these computers different from classical computers is that classical computers calculate using binary digits (bits), which can take only one of two values (1 or 0). Quantum computers, on the other hand, use quantum bits (qubits), which can exist in a superposition of 1 and 0 simultaneously, meaning that, in effect, more than one calculation can be performed at a time. Quantum computers are more than just a concept.
The first quantum computer sold, the D-Wave One, was bought by the aerospace, defence and security company Lockheed Martin in 2011. The same company made further purchases from D-Wave in 2013 and 2015, so it appears that the quantum computer did in fact function as intended. However, the fact that the D-Wave One was valued at around 15 million USD suggests it could take a long time for prices to fall far enough for it to become a personal computer. Nevertheless, the price will almost certainly fall. The first digital computer cost 400,000 USD to build (around 530,000 USD today). It was never sold, because it had to be abandoned during the Second World War; had it been, it would likely have sold for far more than it cost. The fact that the average personal computer now costs 633 USD suggests that the price of quantum computers could also drop, though they will likely remain more expensive than classical computers.
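The bit-versus-qubit distinction described above can be sketched in a few lines of code. This is a minimal simulation, not real quantum hardware: a single qubit is represented by two amplitudes, and a Hadamard gate (a standard quantum operation) turns a definite 0 into an equal superposition of 0 and 1.

```python
import math

# A classical bit holds exactly one of two values at any moment.
bit = 0

# A qubit's state is a pair of amplitudes (alpha, beta) for the values 0 and 1;
# the probability of measuring each value is the squared amplitude.
def hadamard(state):
    """Apply a Hadamard gate, mapping a definite state into a superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

qubit = (1.0, 0.0)       # a definite 0
qubit = hadamard(qubit)  # now an equal superposition of 0 and 1
probabilities = (qubit[0] ** 2, qubit[1] ** 2)
print(probabilities)     # measuring yields 0 or 1, each with probability 0.5
```

A real quantum computer exploits interference between many such superposed states, which this single-qubit sketch does not capture; it only illustrates the state representation.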

The general consensus, however, is that personal computers will stop improving in the near future, and I agree with this opinion. Not only are classical computers reaching their physical limits, but the economic limitations on both consumers and producers mean that there will be limited advances, if any. However, when the price of quantum computers comes down, as the price of classical computers did, I believe that quantum computers can and will replace our classical personal computers, even if they are never quite as affordable. At the moment, it looks unlikely that the way personal computers work, or the materials and methods used to create them, will change. Research has been undertaken by various institutions for years, and the fact that there have been no commercially and functionally suitable changes to the materials computers are made of suggests that we will be using personal computers with silicon integrated circuits for a long time to come.