Quantum computers can never correct errors.
(Fig.1) The latest paper published in Nature also showed error rates that are far too high, which does Not change the fact that quantum computers are useless forever.
Due to the current dead-end mainstream science, physicists have had No choice but to artificially fabricate fictitious scientific targets, one of which is the so-called parallel-world quantum computer ( this last ).
Of course, these fictional scientific targets, including quantum computers, artificial intelligence, black holes and extra-dimensional string theory, have led to No practical application due to their lack of real technological innovation, so the media, academia and even top journals have to hype and spread exaggerated news to make this useless quantum technology look useful or promising, misleading ordinary people.
The typical examples of this unwanted 'cooperation' between academia (= ex. Nature journal ) and corporations (= ex. Google Sycamore ), artificially creating the hyped mainstream pseudo-science, are seen in the currently alleged 'milestone' quantum supremacy, the fictional wormhole quantum computer ( this 8th-last paragraph ), and the illusory time-travel perpetual-motion machine called time crystals. ← None of them has been put to practical use.
The recent research paper misleadingly claiming a "milestone in quantum computer error correction," published in the latest issue of the journal Nature, is also one of those over-hyped science topics that is useless, with No impact on our daily-life technology.
↑ How many 'milestones' or 'steps' are needed to realize such an (illusory) practical quantum computer? ← If infinite milestones and steps are needed, this eye-catching news is nothing but hype and its research is meaningless, deserving No attention.
Actually, this or this Nature article's 4th paragraph admits its uselessness,
" The improvement (of quantum compupter's error correction ) is still small, the Google researchers admit, and the error rate needs to drop much more. It came down by a little; we need it to come down a lot."
It is known that quantum computers suffer from error rates too high to ever allow their practical use and scaling-up ( this 8th-paragraph, this 4-8th paragraphs ), in addition to their intrinsically defective mechanism relying on fantasy parallel worlds.
One in 100 or 1000 manipulations of each qubit (= quantum bit ) causes a fatal error, which means the quantum computer's error rate is as high as 10⁻² ~ 10⁻³ (= 1% ~ 0.1 % ) ( this 7th paragraph, this 3rd-paragraph ), which is too high to be practical ( this p.1-introduction-2nd-paragraph ).
This 5th-paragraph says
"According to Google, most applications would call for error rates as low as 10-15 in comparison, state-of-the-art quantum platforms currently have average error rates that are nearer 10-3 (= too high error rate )."
And today's quantum computers (= still Not computers ) are too big and bulky; even though they have fewer than 100 qubits, they need very big refrigerators to chill them to almost absolute zero, wasting an extremely large amount of electricity to operate their bulky, energy-inefficient machines ( this middle ). ← Is this really good for climate change?
On the other hand, the ordinary practical classical computers or laptops are known to cause almost No errors, or far lower error rates than the quantum computers (= errors that can be easily corrected, hence the classical computer's effective error rate is zero ) ( this 17th paragraph ).
Classical computers or laptops with more than a billion bits or transistors (= which correspond to the qubits of quantum computers ) generate fewer than 1000 errors per 1 mega-bit (= 1 Mbit ) during continuous operation or calculation for a billion (= 10⁹ ) hours, or about 0.1 million years. This extremely low error rate is expressed as 1000 FIT (= meaning only 1000 bit-errors, or failures in time, per 10⁹ device-hours; this p.3-right, this p.12, this p.15-table2 ).
A practical classical computer can manipulate each bit more than 10⁹ (= a billion ) times per second (= several GHz ), which is far faster than a quantum computer's qubit manipulation (= ~80 MHz ). ← The "faster" quantum computer or quantum supremacy is a lie.
It means the error rate of the ordinary practical classical computer is negligibly low (= on the order of 10⁻²⁵, causing only about one error in 10²⁵ bit-operations ), which is far, far better than the useless quantum computer causing one error in a mere 100 or 1000 qubit operations ( this 13th-paragraph )!
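This ~10⁻²⁵ figure can be re-derived from the FIT numbers above. A minimal sketch, assuming the illustrative values of 1000 FIT per Mbit and a 1 GHz per-bit manipulation rate quoted in this section:

```python
# Back-of-the-envelope check of the classical per-operation error rate
# (assumed illustrative values: 1000 FIT per Mbit, 1 GHz bit rate).
fit = 1000                  # failures per 10^9 device-hours (per Mbit)
bits = 1e6                  # 1 Mbit of memory
fit_hours = 1e9             # FIT reference interval in hours
ops_per_bit_per_sec = 1e9   # ~1 GHz manipulation rate per bit

errors_per_bit_hour = fit / (fit_hours * bits)   # = 1e-12
ops_per_bit_hour = ops_per_bit_per_sec * 3600    # = 3.6e12
print(f"{errors_per_bit_hour / ops_per_bit_hour:.1e} errors per operation")
# -> 2.8e-25, i.e. the ~10^-25 order quoted above
```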
In order to suppress this extremely high error rate of quantum computers, physicists have to treat some number of (physical) qubits as one fictitious (logical) qubit. ← This single fictitious logical qubit consisting of multiple physical qubits is the quantum computer's substantial minimum unit usable for actual calculation.
So we cannot use the 72-qubit Google Sycamore or the 433-qubit IBM quantum computer as a 72-qubit or 433-qubit quantum computer as it is.
Instead, we have to use this entire 72-qubit Google quantum computer as only one (fictitious logical) qubit! ← still Not a computer at all
For example, when they treat three (physical) qubits as one (logical) qubit, they have to put all three qubits into the same "1 (or 0)" bit state, like 111 (or 000 ), to express one (fictitious) logical qubit in the "1 (or 0)" bit state.
↑ If one of these three qubits suffers an error, giving "110", they can fix this error by looking at the other two (physical) qubits (= 11 ), and restore the original state, "110" → "111", as one logical qubit (= 1 ) based on majority logic ( this middle ). ← This simple, primitive logic is the basis of quantum error correction.
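A minimal classical sketch of this majority logic (illustrating only the voting idea, Not actual quantum-hardware code):

```python
# Minimal sketch of 3-bit repetition-code majority voting (a classical
# illustration of the idea only, Not real quantum hardware code).
from collections import Counter

def encode(bit: int) -> list[int]:
    """Encode one logical bit as three copies: 1 -> [1, 1, 1]."""
    return [bit] * 3

def majority_correct(qubits: list[int]) -> list[int]:
    """Restore all three copies to the majority value: [1,1,0] -> [1,1,1]."""
    majority = Counter(qubits).most_common(1)[0][0]
    return [majority] * 3

logical = encode(1)               # [1, 1, 1]
logical[2] = 0                    # one bit-flip error -> [1, 1, 0]
print(majority_correct(logical))  # -> [1, 1, 1]
```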
Quantum mechanics imposes the unreasonably inconvenient rule that physicists are Not allowed to measure each qubit's state to correct errors, because the measurement allegedly destroys the quantum superposition or parallel worlds, so they have to perform extremely time-consuming manipulations without directly looking at each qubit's state. ← impractical
For example, in these three physical qubits treated as one (fictitious) logical qubit, they check whether two qubits out of the three are the same or not, and judge which physical qubit caused the error, without directly measuring each qubit's 0 or 1 state ( this middle ).
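A classical toy version of this indirect check: the decoder below sees only the two parity (= syndrome ) bits and never reads the three data bits themselves (again, just an illustration of the idea, Not real quantum error-correction code):

```python
# Toy syndrome decoding for the 3-bit repetition code: the decoder is
# given only the two parity bits, never the raw data bits themselves
# (an illustration of the idea, Not real quantum error correction).

def syndromes(q: list[int]) -> tuple[int, int]:
    """Parity of (q0,q1) and (q1,q2); 1 means 'these two disagree'."""
    return (q[0] ^ q[1], q[1] ^ q[2])

def locate_error(s01: int, s12: int) -> int | None:
    """Infer which single qubit flipped from the syndrome pattern alone."""
    return {(0, 0): None,           # no error detected
            (1, 0): 0,              # q0 disagrees with q1 only -> q0 flipped
            (1, 1): 1,              # q1 disagrees with both    -> q1 flipped
            (0, 1): 2}[(s01, s12)]  # q2 disagrees with q1 only -> q2 flipped

data = [1, 1, 1]
data[1] ^= 1                       # flip the middle qubit -> [1, 0, 1]
pos = locate_error(*syndromes(data))
if pos is not None:
    data[pos] ^= 1                 # apply the correction
print(data)                        # -> [1, 1, 1]
```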
In order to build (illusory) practical quantum computers out of these qubits with too-high error rates, physicists have to combine as many as 1000 (physical) qubits into one single fictitious logical qubit ( this last-paragraph, this 12th-paragraph ).
And for practical calculation, more than 1000 (fictitious) logical qubits are necessary, so more than a million physical qubits (= 1000 physical qubits × 1000 logical qubits ) are necessary for realizing the (illusory) future practical quantum computer ( this 4-5th-paragraphs, this 10~16th paragraphs, this 2nd-last-paragraph, this 8th-paragraph ).
This middle says
"Researchers estimate that a quantum computer will need between 1,000 to 100,000 logical qubits to solve computational problems encountered in practice..
"..And to keep errors in check, we will need to represent each logical qubit by 1,000 to 100,000 physical qubits – depending on how well we can control the qubits. All together, we will need a few million qubits to make quantum computers perform useful work that is also reliable."
This 2nd-paragraph says
"the most common estimates are that it will require on the order of 1,000 error corrected qubits to break current encryption systems, and that it will take 1,000 physical qubits to make one error corrected (logical) qubit. So we’ll need an order of 1 million qubits, and current quantum computers are all in the area of 100 qubits. Figuring out how to scale our current quantum computers by 5 orders of magnitude may well be the biggest problem facing researchers, and there’s No solution in sight."
Even the current largest quantum computer, IBM Osprey, has only 433 qubits, which is still far short of a practical quantum computer that would need millions of qubits.
And this allegedly largest IBM quantum computer cannot correct errors (= they just put more qubits together with No improvement in error correction ), so it is useless as a practical computer.
This is because the current best and largest quantum computer that can allegedly correct errors is Google Sycamore with only 72 qubits (= though its error rate is still too high to be practical ).
Google researchers prepared two types of logical qubits: one logical qubit consisting of 49 (physical) qubits (out of the 72 Sycamore qubits ), and the other, smaller logical qubit consisting of 17 qubits.
Basically, as one logical qubit contains more physical qubits, it should be more likely to correct errors (= two qubit errors in a bigger logical qubit consisting of 1000 physical qubits can be easily fixed by looking at the other intact qubits, compared to two qubit errors in a smaller logical qubit consisting of only 3 physical qubits ).
But the current quantum computer suffers too many errors, so as the logical qubit becomes bigger, containing more physical qubits, its error rate ironically tends to be higher, contrary to the original theory.
Only in this latest research, for the first time, do they claim that the error rate of the logical qubit consisting of 49 qubits is slightly lower (= but still as high as 2.9 % ! ) than that of the smaller logical qubit consisting of 17 qubits (= 3.0 % ). This tiny improvement is allegedly the first in the quantum computer's history, hence the exaggerated 'milestone' words rampant in the media (= actually Not a milestone for practical purposes at all, this 10th-paragraph, this 3rd-last paragraph ).
↑ Even in this latest research, the qubit's error rate is still too high (= 3.0 %, or 0.03, which generates 3 errors in every 100 qubit manipulations ) and impractical, in contrast to the practical classical computer's extremely low error rate (= on the order of 10⁻²⁵ ).
And the 49 qubits of Google Sycamore ( this Fig.1 ) are far fewer than the millions of qubits required for the (illusory) future practical quantum computer with more than 1000 logical qubits.
This article's 3rd-last ~ last paragraphs say
"the larger logical qubit (= consiting of 49 qubits ) had an error rate of 2.914 percent versus 3.028 percent in the smaller one (= 17 qubits ). That's Not much of an advantage, but it's the first time any advantage of this sort has been demonstrated. And it has to be emphasized that either error rate is too high to use one of these logical qubits in a complex calculation.."
".. So, to be clear, a 0.11 percent improvement in error correction that requires roughly half of Google's processor to host a single qubit doesn't represent any kind of computational advance. We're not closer to breaking encryption than we were yesterday." ← This latest research does Not change the fact that quantum computers are still useless.
And the Google quantum computer could Not fix the errors it detected ( this 3rd-paragraph ).
Only this Google quantum computer with fewer than 50 qubits allegedly could correct errors (= though with still too-high error rates, hence impractical ), which means the current largest IBM 433-qubit quantum computer is meaningless, with No ability to correct its errors (= just increasing the number of qubits without reducing errors is meaningless ).
See this "why it doesn’t matter (yet, much) for developers ?" section about the error-prone impractical IBM quantum computers.
As a result, this latest research published in the top journal Nature is far from a "milestone" in terms of practical computers, hence just one of the hypes flooding the current media and academia.
The fact that an (illusory) practical quantum computer with millions of qubits is still impossible despite long years of research across the world means that its basic theory (= quantum mechanics itself ) has already come to a dead end, and hampers our real scientific progress forever.
2023/2/25 updated. Feel free to link to this site.