Top (quantum mechanics is unrealistic ← 6/15/2025 )
Quantum computers can never correct errors.
(Fig.1) The latest paper published in Nature also showed error rates that are still too high, which does Not change the fact that quantum computers are useless forever.
Due to the current dead-end mainstream science, physicists had No choice but to artificially fabricate fictitious scientific targets, one of which is the so-called parallel-world quantum computers ( this last ).
Of course, these fictional scientific targets, including quantum computers, artificial intelligence, black holes and extra-dimensional string theory, led to No practical application due to their lack of real technological innovation, so the media, academia and even top journals have to hype and spread exaggerated news to make this useless quantum technology look useful or promising, misleading ordinary people.
The typical examples of this unwanted 'cooperation' between academia (= ex. Nature journal ) and corporations (= ex. Google Sycamore ) artificially creating the hyped mainstream pseudo-science are seen in the current alleged 'milestone' quantum supremacy, fictional wormhole-quantum computer ( this 8th-last paragraph ), and the illusory time-travel perpetual machine called time crystals. ← None of them have been put to practical use.
The recent research paper misleadingly claiming "milestone quantum computer's error correction" published in the latest journal Nature is also one of those over-hyped science topics that is useless with No impact on our daily life technology.
↑ How many 'milestones' or 'steps' are needed to realize such a (illusory) practical quantum computer ? ← If infinite milestones and steps are needed, this eye-catching news is nothing but hype and its research is meaningless, deserving No attention.
Actually, this or this Nature article's 4th paragraph admits its uselessness,
" The improvement (of quantum computer's error correction ) is still small, the Google researchers admit, and the error rate needs to drop much more. It came down by a little; we need it to come down a lot."
It is known that quantum computers suffer too high error rates that prevent their practical use and scaling-up forever ( this 8th-paragraph, this 4-8th paragraphs ) in addition to their intrinsic defective mechanism relying on fantasy parallel worlds.
One in 100 or 1000 manipulations of each qubit (= quantum bit ) causes a fatal error, which means quantum computer's error rate is as high as 10⁻² ~ 10⁻³ (= 1% ~ 0.1% ) ( this 7th paragraph, this 3rd-paragraph ), which is too high to be practical ( this p.1-introduction-2nd-paragraph ).
This 5th-paragraph says
"According to Google, most applications would call for error rates as low as 10⁻¹⁵. In comparison, state-of-the-art quantum platforms currently have average error rates that are nearer 10⁻³ (= too high error rate )."
And today's quantum computers (= still Not computers ) are too big and bulky, even though they have fewer than 100 qubits; they need very big freezers to chill them to almost absolute zero, wasting an extremely large amount of electricity to operate these bulky, energy-inefficient machines ( this middle ). ← Is this really good for climate change ?
On the other hand, ordinary practical classical computers or laptops are known to cause almost No errors, or far lower error rates (that can be easily corrected, hence the classical computer's effective error rate is zero ) than quantum computers ( this 17th paragraph ).
Classical computers or laptops with more than a billion bits or transistors (= which correspond to the qubits of quantum computers ) generate less than 1000 errors per mega-bit (= 1 Mbit ) during continuous operation or calculation for a billion hours (= roughly 0.1 million years ). This too-low error rate is expressed as 1000 FIT (= meaning only 1000 bit-errors or failures in time per billion (10⁹) hours of operation, this p.3-right, this p.12, this p.15-table2 ).
A practical classical computer can manipulate each bit more than 10⁹ (= a billion ) times per second (= several GHz ), which is faster than a quantum computer's qubit manipulation (= 80 MHz ). ← The 'faster quantum computer' or quantum supremacy is a lie.
It means the error rate of the ordinary practical classical computer is negligibly low = less than 10⁻²⁵, which causes only one error in 10²⁵ bit-operations, far, far better than the useless quantum computer causing one error in a mere 100 or 1000 qubit operations ( this 13th-paragraph ) !
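The arithmetic above can be sketched as a quick back-of-the-envelope check (the FIT figure and quantum error rate are the values quoted above; the 3 GHz clock is an assumed typical value, not from the sources):

```python
# Back-of-the-envelope comparison of classical vs. quantum per-operation
# error rates, using the figures quoted above (illustrative sketch only).

FIT = 1000                  # quoted: ~1000 failures per 10^9 hours per Mbit
bits_per_mbit = 1e6
clock_hz = 3e9              # assumed typical ~3 GHz classical clock

errors_per_bit_hour = FIT / 1e9 / bits_per_mbit   # errors per bit per hour
ops_per_bit_hour = clock_hz * 3600                # operations per bit per hour
classical_error_rate = errors_per_bit_hour / ops_per_bit_hour

quantum_error_rate = 1e-3   # quoted: ~1 error per 1000 qubit operations

print(f"classical per-operation error rate ~ {classical_error_rate:.1e}")  # ~10^-25
print(f"quantum per-operation error rate   ~ {quantum_error_rate:.1e}")
print(f"quantum is worse by a factor of    ~ {quantum_error_rate / classical_error_rate:.0e}")
```

The ratio comes out to roughly twenty-two orders of magnitude, which is the gap the text describes.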
In order to suppress this extremely high error rate of quantum computers, physicists have to treat some number of (physical) qubits as one fictitious (logical) qubit. ← This single fictitious logical qubit consisting of multiple physical qubits is the quantum computer's substantial minimum unit usable for actual calculation.
So we cannot use 72-qubit Google Sycamore or 433-qubit IBM quantum computer as 72-qubit or 433-qubit quantum computer as it is.
Instead, we have to use this entire 72-qubit Google quantum computer as only one (fictitious logical) qubit ! ← still Not a computer at all
For example, when they treat three (physical) qubits as one (logical) qubit, they have to make all these three qubits "1 (or 0) bit state" like 111 (or 000 ) to express one (fictitious) logical qubit of "1 (or 0 ) bit state."
↑ If one of these three qubits suffers an error, giving "110", they can fix this error by looking at the other two (physical) qubits (= 11 ), and return the state to the original, like "110" → "111", as one logical qubit (= 1 ) based on majority logic ( this middle ). ← This simple, primitive logic is the basis of quantum error correction.
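The majority logic described above can be written out as a tiny classical sketch (this is only the classical analogue of the 3-qubit repetition code, not real quantum hardware):

```python
from collections import Counter

def majority_correct(codeword):
    """Restore a 3-bit repetition codeword by majority vote.

    A single flipped bit (e.g. "110") is outvoted by the other
    two bits, returning the codeword to "111" (or "000").
    """
    majority_bit, _ = Counter(codeword).most_common(1)[0]
    return majority_bit * len(codeword)

print(majority_correct("110"))  # one flip -> "111"
print(majority_correct("010"))  # one flip -> "000"
```

Note that this vote only works while at most one of the three bits is wrong; two simultaneous flips are "corrected" to the wrong value.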
Quantum mechanics imposes the unreasonably inconvenient rule where physicists are Not allowed to measure each qubit's state to correct errors due to the measurement allegedly destroying the quantum superposition or parallel worlds, so they have to do extremely time-consuming manipulations without directly looking at each qubit state. ← impractical
For example, in these three physical qubits treated as one (fictitious) logical qubit, they check whether two qubits out of the three are the same or not, and judge which physical qubit caused the error, without directly measuring each qubit's 0 or 1 state ( this middle ).
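The pairwise-comparison idea above can be sketched classically as two parity checks (this mimics the syndrome logic only; on real hardware the parities are extracted via ancilla qubits, which this toy sketch does not model):

```python
def syndrome(bits):
    """Two parity checks on a 3-bit codeword: compare qubits 0&1
    and qubits 1&2, never reading any single bit's value alone."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def locate_error(bits):
    """Map the two parity bits to the index of the flipped qubit
    (None means the syndrome is clean)."""
    return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(bits)]

print(locate_error([0, 1, 1]))  # codeword 111 with qubit 0 flipped -> 0
print(locate_error([1, 1, 1]))  # no error -> None
```

Each syndrome value points to exactly one single-bit error location, which is why the faulty qubit can be identified from the comparisons alone.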
In order to build (illusory) practical quantum computers based on these qubits with too-high error rates, physicists have to combine as many as 1000 (physical) qubits into one single fictitious logical qubit ( this last-paragraph, this 12th-paragraph ).
And for practical calculation, more than 1000 (fictitious) logical qubits are necessary, so more than a million qubits (= 1000 physical qubits × 1000 logical qubits ) are necessary for realizing the (illusory) future practical computer ( this 4-5th-paragraphs, this 10~16th paragraphs, this 2nd-last-paragraph, this 8th-paragraph ).
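The scaling arithmetic quoted above is simple to check (the 1000 × 1000 figures are the estimates quoted in the surrounding paragraphs):

```python
physical_per_logical = 1000       # quoted estimate: physical qubits per logical qubit
logical_for_practical_use = 1000  # quoted estimate: logical qubits for real calculation

total_physical = physical_per_logical * logical_for_practical_use
largest_today = 433               # IBM Osprey's qubit count, quoted in the text

print(total_physical)                   # 1,000,000 physical qubits required
print(total_physical // largest_today)  # today's largest machine is ~2300x too small
```

That gap, roughly four to five orders of magnitude, is exactly the scaling problem the quoted articles call unsolved.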
This middle says
"Researchers estimate that a quantum computer will need between 1,000 to 100,000 logical qubits to solve computational problems encountered in practice..
"..And to keep errors in check, we will need to represent each logical qubit by 1,000 to 100,000 physical qubits – depending on how well we can control the qubits. All together, we will need a few million qubits to make quantum computers perform useful work that is also reliable."
This 2nd-paragraph says
"the most common estimates are that it will require on the order of 1,000 error corrected qubits to break current encryption systems, and that it will take 1,000 physical qubits to make one error corrected (logical) qubit. So we’ll need an order of 1 million qubits, and current quantum computers are all in the area of 100 qubits. Figuring out how to scale our current quantum computers by 5 orders of magnitude may well be the biggest problem facing researchers, and there’s No solution in sight."
Even the current largest quantum computer, IBM Osprey, has only 433 qubits, still far fewer than the practical quantum computer that should have more than millions of qubits.
And this alleged-largest IBM quantum computer cannot correct errors (= they just put together qubits with No improvement of error correction ), so useless as practical computers.
Because it is said that the current best and largest quantum computer that can allegedly correct errors is Google Sycamore with only 72 qubits (= though its error rate is still too high to be practical ).
Google researchers prepared two types of logical qubits where one logical qubit consists of 49 (physical) qubits (out of 72 Sycamore qubits ), and the other smaller logical qubit consists of 17 qubits.
Basically, as one logical qubit contains more physical qubits, the logical qubit should be more likely to correct errors (= two qubit errors in a bigger logical qubit consisting of 1000 physical qubits can be easily fixed by looking at the other intact qubits, compared to two qubit errors in a smaller logical qubit consisting of only 3 physical qubits ).
But the current quantum computer suffers too many errors, so as the logical qubit becomes bigger containing more physical qubits, its error rate tends to be higher ironically, contrary to the original theory.
Only in this latest research, for the first time, they claim that the error rate of the logical qubit consisting of 49 qubits is slightly lower (= but its error rate is still as high as 2.9 % ! ) than that of the smaller logical qubit consisting of 17 qubits (= 3.0 % ). This tiny improvement is allegedly the first in quantum computer's history, hence the exaggerated 'milestone' words rampant in the media (= actually Not a milestone for practical purposes at all, this 10th-paragraph, this 3rd-last paragraph ).
↑ Even in this latest research, the qubit's error rate is still too high (= 3.0 % or 0.03, which generates 3 errors in every 100 manipulations of qubits ) and impractical, in contrast to the practical classical computer's extremely low error rate (= less than 10⁻²⁵ ).
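To see why a ~3 % per-operation error rate is impractical, one can compute the chance that even a short calculation finishes error-free (a sketch using the rates quoted above and assuming independent errors per operation):

```python
p_logical = 0.03      # quoted: ~3% error per logical-qubit operation
p_classical = 1e-25   # quoted: classical per-operation error rate
ops = 100             # a tiny 100-operation calculation

survive_quantum = (1 - p_logical) ** ops
survive_classical = (1 - p_classical) ** ops

print(f"quantum  : {survive_quantum:.3f}")     # ~0.048, i.e. ~95% of runs fail
print(f"classical: {survive_classical:.15f}")  # indistinguishable from 1.0
```

Real algorithms need millions or billions of operations, where the quantum survival probability drops to essentially zero.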
And 49 qubits of Google Sycamore ( this Fig.1 ) are far less than millions of qubits required for the (illusory) future practical quantum computer with more than 1000 logical qubits.
This article's 3rd-last and last paragraphs say
"the larger logical qubit (= consisting of 49 qubits ) had an error rate of 2.914 percent versus 3.028 percent in the smaller one (= 17 qubits ). That's Not much of an advantage, but it's the first time any advantage of this sort has been demonstrated. And it has to be emphasized that either error rate is too high to use one of these logical qubits in a complex calculation.."
".. So, to be clear, a 0.11 percent improvement in error correction that requires roughly half of Google's processor to host a single qubit doesn't represent any kind of computational advance. We're not closer to breaking encryption than we were yesterday." ← This latest research does Not change the fact that quantum computers are still useless.
And Google quantum computer could Not fix errors it detected ( this 3rd-paragraph ).
Only this Google quantum computer with less than 50 qubits allegedly could correct errors (= though still too high error rates, hence impractical ), which means the current largest IBM 433-qubit quantum computer is meaningless with No ability to correct their errors (= just increasing numbers of qubits without reducing errors is meaningless ).
See this "why it doesn’t matter (yet, much) for developers ?" section about the error-prone impractical IBM quantum computers.
As a result, this latest research published in top journal Nature is far from "milestone" in terms of practical computers, hence, just one of the hypes flooding the current media and academia.
The fact that the (illusory) practical quantum computer with millions of qubits is still impossible despite long years of research across the world means its basic theory = quantum mechanics itself has already come to a dead end, and hampers our real scientific progress forever.
(Fig.2) Overhyped fake news tries to hide the hopeless quantum computer's error correction ↓
This overhyped headline (6/20/2025) baselessly says
"Reliable quantum computing is here: Novel approach to error-correction can reduce errors in future (= still unrealized ) systems up to 1,000 times,"
↑ This latest Microsoft's overhyped research is just a 'pie-in-the-sky' theory without experimental realization, as this-2nd-paragraph says "it hasn't shown that it works using actual hardware yet ( this-lower-Limitation )"
↑ These recent overhyped Microsoft and IBM (6/10/2025) researches are just imaginary theories simulated by classical computers' decoders (= BP-OSD decoder, this-p.15, this-p.6-left-V, this-p.59-p.62: IBM's classical computer's CPLEX was used ) without using their error-prone useless quantum computers.
This-3rd-paragraph says
"Among the quantum error correction resources being developed are classical algorithms that identify errors that occur during quantum computation. These classical algorithms are known as decoders."
↑ The recent Microsoft-Atom Computing research used only two error-prone neutral-atom qubits, as this-p.4-right-2nd-paragraph (& Fig.2, 6/11/2025 ) says
"generate the Bell state (|00⟩ + |11⟩) (= just two qubits whose bitstring is 00 or 11, which is still Not a quantum computer )... We observe an average Bell state fidelity of 98.8(1)% for both preparations. These Bell state fidelities are post-selected on atom survival"
The last paragraph of this overhyped news (6/20/2025) says
"the era of quantum computing is not as far off as we imagine. Whether one calls it magic or physics, this technique certainly marks an important step toward (= still unrealized ) the development of larger-scale quantum computers that can withstand noise"
↑ This research paper's p.7-V numerical simulation says this research is just a classical computer's (= numerical ) simulation of only 20 qubits with some noise model, without using the current error-prone useless quantum computers.
The 6th-paragraph of this hyped news (4/30/2025) says
"There is still much work to be done before the architecture could be used in a real quantum computer, but demonstrating the fundamental physics behind the process is a major step in the right direction (= still useless ),"
↑ This recent MIT research used just 2 bulky superconducting qubits (= one qubit can take a 0 or 1 value, so still Not a quantum computer, this-p.2-Fig.1, p.6-left-matter-matter nonlinear coupling ) with an impractically high error rate of ~1% ( this-p.5-table I ).
The 1st and last paragraphs of this overhyped news (6/25/2025) say
"Scientists have built a compact physical qubit (= just one single qubit 0 or 1, still Not a computer ) with built-in error correction (= false )"
"Nord Quantique plans to release a 100-logical-qubit machine by 2029" ← baseless speculation that will never be realized.
↑ The 3rd-paragraph of this latest research by Nord Quantique says
"post-selection was used to filter out imperfect runs, and 12.6% of the data was discarded each round. "
↑ As shown in this research paper's middle Fig.4 and Fig.5, their one (logical) qubit's (= just one qubit, 0 or 1, still Not a computer ) error rate increased with each (fake) quantum error correction round (= number of QEC rounds, this-(c) ), so their qubit's errors could Not be corrected.
↑ So they had to discard erroneous qubits' results post-selectively instead of correcting their errors (= because today's impractical quantum error correction operation itself increases errors instead of decreasing them ), hence the qubit's survival probability (= after discarding ) decreased to zero (= meaning its qubit's error rate reached 100% ) after just 32 QEC rounds ( this-(b) ), which is completely useless and cannot be scaled up.
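The quoted discard rate alone explains the collapse: keeping 87.4 % of the data each round compounds to almost nothing after 32 rounds (a sketch using only the figures quoted above, assuming the same discard fraction every round):

```python
keep_per_round = 1 - 0.126   # quoted: 12.6% of the data discarded each QEC round
rounds = 32                  # quoted: survival reached ~zero after 32 rounds

survival = keep_per_round ** rounds
print(f"{survival:.1%}")     # only ~1.3% of runs survive 32 rounds
```

So even before counting uncorrected errors, post-selection alone throws away ~99 % of the data, which is why this approach cannot be scaled up.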
↑ Despite this incredible amount of overhyped fake news, quantum computer's error correction is impossible, as the 3rd paragraph of this recent news (6/25/2025) says
"Quantum computing has the potential (= baseless speculation, still useless ) to allow calculations that are not possible with classical computers,
But qubits are finicky, meaning that errors accumulate during calculations"
↑ Still quantum computer's error correction is impossible forever.
Feel free to link to this site.