By the early 2020s, many experts had noted that the law formulated by Gordon Moore in 1965, which predicted that the number of transistors on a chip would double roughly every year (a period Moore later revised to two years, and which is often cited as 18 months), was no longer holding as reliably as it once had.
This is because the 3-nanometer chips available today, and the 2-nanometer processes now being tested by Samsung and other companies, are approaching the physical limits of how small transistors can be made and how densely they can be packed onto chips.
Computer performance and efficiency nevertheless continue to advance along other technological paths, such as better processor architectures and better software and algorithms. And to reach ever higher speeds, there is a growing reliance on supercomputers built from thousands of processors.
Supercomputers
The giant Frontier ranked first on the June 2024 edition of the TOP500 list of the world's fastest computers, published at top500.org. Frontier's speed exceeds 1.1 exaflops, that is, more than a billion billion (10¹⁸) floating-point operations per second. The machine is housed at Oak Ridge National Laboratory in Tennessee, which is operated for the US Department of Energy. Frontier cost about $600 million to build; installation began in 2021 and the system reached full operation in 2022.
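To give a sense of that scale, here is a rough illustrative calculation (the figure of 8 billion people each doing one calculation per second is an assumption used only for comparison, not a figure from the TOP500 list):

frontier_flops = 1.1e18          # Frontier's speed: about 1.1 * 10**18 operations per second
world_population = 8e9           # illustrative assumption: roughly 8 billion people

# If every person on Earth performed one calculation per second,
# how long would humanity need to match a single second of Frontier's work?
seconds_needed = frontier_flops / world_population
years_needed = seconds_needed / (365 * 24 * 3600)
print(f"{seconds_needed:.2e} seconds, about {years_needed:.1f} years")
# prints roughly 1.38e+08 seconds, i.e. more than four years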
This supercomputer is used to accomplish many tasks, including advanced scientific research, climate change simulation, molecular biology, genome analysis, and drug design, which helps in developing new treatments and understanding diseases at a deeper level. It is also used in analyzing big data, developing artificial intelligence, and many other uses that require superior computational capabilities.
This supercomputer is equipped with 9,472 central processing units and 37,888 graphics processing units from AMD, a total of 47,360 processors working together to deliver its enormous computational power. Yet even machines like this cannot solve every computational problem of the artificial intelligence era. With conventional chips nearing their physical limits, there is an urgent need to explore new computing models that can keep pushing computational speed forward, and work on quantum computing has flourished as a result.
Quantum computing
The theoretical foundations of quantum computing were laid in 1981 by Richard Feynman, the American Nobel Prize-winning physicist, and extended in 1985 by the research of the English physicist David Deutsch. Feynman's insight was that a quantum computer would be effective at simulating a universe that itself operates according to quantum mechanics.
Traditional computing uses a conventional bit that can be either 0 or 1, while quantum computing uses a quantum bit, or qubit, that can be in the state 0, the state 1, or a superposition of both, meaning it can be 0 and 1 at the same time with different weights. This is what gives quantum computing its power: a register of qubits can work on a large number of possibilities at once, making it more effective than traditional computing at solving certain complex problems.
With two qubits, each qubit can be in a superposition, so the system can represent four states at the same time: 00, 01, 10, and 11. The more qubits there are, the more states can be handled simultaneously. For example, three qubits span 2³, or 8, states; 100 qubits span 2¹⁰⁰ states, a number of about 1,267 followed by 27 zeros, and so on. This enormous number of simultaneous states reflects the potential power of quantum computing, but harnessing it depends on exploiting these superposition states efficiently, which requires continuous development of quantum algorithms and error correction.
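The growth in the number of states can be illustrated with a small state-vector calculation. The following Python sketch (written for illustration here, not taken from any quantum computing platform) applies a Hadamard gate to each qubit of an n-qubit register and counts the resulting amplitudes:

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: puts one qubit into an equal superposition
ZERO = np.array([1.0, 0.0])                     # a single qubit in the state |0>

def uniform_superposition(n_qubits):
    """State vector of n qubits, each sent through a Hadamard gate."""
    state = np.array([1.0])
    for _ in range(n_qubits):
        state = np.kron(state, H @ ZERO)        # append one qubit in superposition to the register
    return state

for n in (1, 2, 3):
    psi = uniform_superposition(n)
    print(f"{n} qubit(s) -> {len(psi)} basis states, amplitudes {np.round(psi, 3)}")
# 3 qubits -> 8 basis states, each with amplitude 1/sqrt(8) ≈ 0.354
# 100 qubits would give 2**100 ≈ 1.27e30 amplitudes, far too many to list classically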
In 1998, IBM and Stanford University performed the first experimental quantum computation: the research team ran a two-qubit quantum algorithm to distinguish between two classes of quantum states, using nuclear magnetic resonance (NMR) technology.
Quantum bit development
Researchers began experimenting with different systems for creating qubits in the 1990s. As understanding and control of individual qubits improved, efforts shifted to increasing the number of qubits to perform more complex calculations, and companies like IBM, Google, and D-Wave began building more advanced quantum systems.
Among these developments, the use of nuclear magnetic resonance for quantum computing in the late 1990s was an important experimental milestone in turning the concepts of quantum computing into a practical reality.
In October 2019, Google announced that its experimental quantum computer, built around a 53-qubit processor, had performed in just over three minutes a calculation that would have taken the world's fastest supercomputer about 10,000 years to complete, although IBM disputed that estimate.
Google, IBM, Microsoft and other giants are investing heavily in quantum computing technology. IBM launched a 1,000-qubit quantum chip just before the end of 2023 and intends to develop a 100,000-qubit quantum system within the next ten years. China is not far behind in this field: its scientists recently developed a 504-qubit quantum computing chip that will be made available to researchers around the world via a new cloud platform for quantum computing.
Physical qubits and logical qubits
Physical qubits are the actual quantum units in quantum computing, and are unstable and error-prone due to noise and interference.
Logical qubits are created by encoding a group of physical qubits together so that errors can be detected and corrected, making computations more reliable.
The number of physical qubits required to create a single logical qubit depends on the quantum error correction method used and on the quality of the physical qubits. The figure currently ranges from about 100 to 1,000 physical qubits per logical qubit, and many experiments aim to reduce it.
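As a rough illustration of what that range implies (a sketch based only on the 100-to-1,000 ratio quoted above; real overheads depend on the error-correction code and the hardware):

def physical_qubits_needed(logical_qubits, physical_per_logical):
    """Naive estimate: total physical qubits = logical qubits x encoding overhead."""
    return logical_qubits * physical_per_logical

# Targeting 100 logical qubits under today's estimated overheads:
for ratio in (100, 1000):
    print(f"100 logical qubits at {ratio}:1 -> {physical_qubits_needed(100, ratio):,} physical qubits")
# 10,000 physical qubits at the optimistic end, 100,000 at the pessimistic end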
In April 2024, Microsoft and quantum computing company Quantinuum announced that they had created 4 logical qubits from just 30 physical qubits, running 14,000 experiments without finding a single error. This would be a major technological breakthrough if the two companies were to succeed in using it to produce commercial quantum computers with a capacity of at least 100 logical qubits.
IBM plans to create 12 logical qubits from 244 physical qubits by 2026. Google has run several experiments building a single logical qubit from different numbers of physical qubits, and showed that a logical qubit made up of 105 physical qubits suppressed errors more effectively than one made up of 72, confirming that grouping larger numbers of physical qubits into a single logical qubit can reduce errors.
By the end of 2023, the record number of logical qubits in a quantum computer stood at 48, achieved in experiments conducted jointly by Harvard University, QuEra Computing, MIT, and the US National Institute of Standards and Technology, allowing more stable and reliable quantum calculations.
This advance is an important step toward developing scalable, fault-tolerant quantum computers, which are essential for large-scale practical quantum applications.
Simulation limit
Conventional supercomputers can today simulate quantum computations equivalent to roughly 50 logical qubits, so the current goal is to develop quantum computers with a capacity of 100 logical qubits (about 10,000 physical qubits with current technologies). At that point, computing would enter a new era in which quantum computers exceed the computational capabilities of conventional supercomputers and move from theoretical or small-scale experiments to influential practical applications.
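One way to see why roughly 50 qubits marks the classical simulation limit is to count the memory a brute-force state-vector simulation would need (a sketch assuming 16 bytes per complex amplitude; more sophisticated simulation techniques can do somewhat better):

def state_vector_memory_gib(n_qubits, bytes_per_amplitude=16):
    """Memory needed to store all 2**n complex amplitudes of an n-qubit state, in GiB."""
    return (2 ** n_qubits) * bytes_per_amplitude / 2**30

for n in (30, 50, 100):
    print(f"{n} qubits -> about {state_vector_memory_gib(n):,.0f} GiB")
# 30 qubits fit in a laptop (~16 GiB); 50 qubits already need ~16 million GiB,
# petabyte-scale supercomputer storage; 100 qubits exceed any conceivable machine.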
QuEra Computing plans to launch several quantum computers in the coming years, starting with a machine offering 30 logical qubits from 3,000 physical qubits in 2025, followed by one with 100 logical qubits from 10,000 physical qubits in 2026, which would be more powerful than today's supercomputers.
Benefits of quantum computing
Quantum computers are not designed to replace conventional computers but to complement them, tackling problems that demand speeds conventional computing cannot reach. They are expected to drive a large-scale revolution in drug development, energy exploration, financial analysis, and weather forecasting. They will also help accelerate generative artificial intelligence, which requires exceptional capabilities for analyzing huge amounts of information, exactly the kind of task at which quantum computers are expected to excel.
The threat posed by quantum computing
Traditional computing would take thousands of years to crack the encryption that currently underpins Internet security, while a sufficiently advanced quantum computer could, in principle, do so in a tiny fraction of that time, posing a serious risk to businesses and to individuals' privacy online. Cracking RSA-2048, however, is estimated to require about 4,000 logical qubits, or roughly 400,000 physical qubits with current technologies, a capability that does not exist today but could potentially be reached within a few years.
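Putting the article's own figures together gives a sense of how far today's hardware is from that threat (a back-of-the-envelope sketch; the 100:1 encoding ratio is the optimistic end of the range quoted earlier):

logical_needed = 4_000        # estimated logical qubits needed to break RSA-2048
physical_per_logical = 100    # optimistic end of today's 100-to-1,000 overhead range
largest_chip_today = 1_000    # roughly, IBM's latest chip (physical qubits)

physical_needed = logical_needed * physical_per_logical
print(f"Physical qubits needed: {physical_needed:,}")                                      # 400,000
print(f"Gap versus today's largest chip: about {physical_needed // largest_chip_today}x")  # ~400x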
Since 2016, the US National Institute of Standards and Technology has been working on encryption algorithms suited to the quantum computing era. Draft versions of the required post-quantum standards have already been published, and Internet service providers are expected to integrate these algorithms into their systems in the future.
Quantum winter
Despite this notable progress, quantum computing still faces major technical challenges, such as keeping qubits stable and correcting the errors caused by quantum noise. Addressing these challenges requires developing new algorithms and building more stable quantum hardware.
This has prompted some experts to warn of a potential quantum winter: if no fundamental solutions to these problems are found, interest in quantum computing could decline for a number of years. Indeed, the Russian physicist Mikhail Dyakonov believes we may never succeed in developing a practical quantum computer. In 2019 he wrote an editorial declaring that the future of quantum computing had reached a dead end, saying, "Scientists will never overcome the problems of noise, scalability, and efficiency necessary to give quantum computers a useful advantage over classical computers."
Such pessimistic views, however, do not reflect the current state of quantum computing research: dozens of teams around the world are actively working on solutions to the problems standing in the way of useful quantum computers.