(Fig.1) Errors of today's hopeless quantum computers increase instead of decreasing in each quantum error-correction round (= error-worsening "correction")
The 3rd, 9th, and 2nd-last paragraphs of this hyped news (11/24/2024) say
"we introduce AlphaQubit, an AI-based decoder that identifies (= just identify, Not correct errors ) quantum computing errors... Google DeepMind's machine learning" ← This decoder is just a classical computer ( this 3rd-paragraph ), because today's error-prone quantum computers alone cannot correct their errors.
"We began by training our model to decode the data from a set of 49 qubits (= just 49 qubits is far from millions of qubits required for a practical quantum computer ) inside a Sycamore quantum processor," ← Today's useless quantum computers cannot correct their errors, so they tried to train the classical computer called decoder ( this abstract, this 3rd-paragraph ) for identifying quantum computer's errors illegitimately.
"it's still too slow to correct errors in a superconducting processor in real time" ← Even relying on classical computer AI also failed to correct error-prone quantum computers after all. ( this lower-the road ahead ). ← quantum error correction was dead.
↑ And relying on an ordinary classical computer means there is No quantum computer advantage in this method.
This Google experiment treated just 49 (physical) qubits as one logical qubit expressing only one bit, 0 or 1 (= all 49 qubits composing the single logical qubit must always take the same 0-or-1 state, which cannot calculate anything), so that the (physical) qubits causing errors could be detected by comparing them with the other intact qubits inside the one logical qubit consisting of multiple qubits sharing the same bit state (this 5th-from-last paragraph, this 9th-paragraph, this 12th-paragraph), as illustrated below.
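Below is a minimal classical sketch of this redundancy idea, assuming plain bits stand in for the 49 qubits: all copies must hold the same 0-or-1 value, so a flipped copy can be detected by comparing it with the majority.

```python
import random

N = 49            # physical bits standing in for the 49 qubits above
logical_bit = 0   # the single 0-or-1 value all copies must share

copies = [logical_bit] * N
copies[random.randrange(N)] ^= 1   # one copy suffers a bit-flip error

# Detection: any copy disagreeing with the majority is flagged as faulty.
majority = 1 if sum(copies) > N // 2 else 0
faulty = [i for i, c in enumerate(copies) if c != majority]
print(f"majority value: {majority}, faulty copy detected at index: {faulty}")

# Note: a real surface code compares parities of neighboring qubits instead of
# reading every qubit directly (direct readout would destroy the quantum
# state); this classical toy only shows why redundancy makes errors detectable.
```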
↑ This research paper ↓
p.1-left-2nd-paragraph says "the logical error rate needs to be reduced to about 10⁻¹² per logical operation, far below the error rates in today's hardware that are around 10⁻³ to 10⁻² per physical operation" ← Today's quantum computers' error rates are far worse than the practically required error rate; they would have to improve by a factor of roughly 10⁹ to 10¹⁰.
p.1-left-last-paragraph says "a logical qubit is formed by a d × d grid of physical qubits, called data qubits, such that errors can be detected" ← This experiment treated Google's 49 (physical) qubits as just one (logical) qubit expressing only 0 or 1, and a single bit is still Not a computer.
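For concreteness, assuming the standard surface-code layout implied by the quote (d × d data qubits plus d² − 1 measure qubits per logical qubit), a short sketch of what one logical bit costs in physical qubits:

```python
# A sketch, assuming the standard surface-code layout: a distance-d patch uses
# d*d data qubits plus d*d - 1 measure qubits, all encoding ONE logical bit.
def physical_qubits(d):
    """Total physical qubits in one distance-d surface-code logical qubit."""
    data = d * d           # the d x d grid of data qubits from the quote
    measure = d * d - 1    # ancilla qubits that detect (Not correct) errors
    return data + measure

for d in (3, 5, 7):
    print(f"distance {d}: {physical_qubits(d)} physical qubits -> 1 logical bit")
# Distance 5 gives 25 + 24 = 49 physical qubits, matching the 49-qubit
# experiment above, and all 49 together store just one logical 0 or 1.
```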
p.2-right-5th and last paragraphs say "the logical error per round (LER = the error rate of the logical qubit per error-detection round),... AlphaQubit achieves an LER of .. 2.748 × 10⁻² (= a 2.748% error rate, which is far too bad) at distance 5"
↑ It means these Google 49 qubits, treated as just one logical qubit, increased their error rate by about 2.748% in each error-correction (= detection) round, and they could Not correct the errors (this 6th-paragraph); the sketch below shows how fast that compounds.
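A back-of-the-envelope sketch of what an LER of 2.748 × 10⁻² per round implies, under the simplifying assumption that rounds fail independently:

```python
# Back-of-the-envelope: with LER = 2.748e-2 per round (the paper's distance-5
# number) and assuming rounds fail independently, the chance the logical qubit
# is still correct decays geometrically with every error-correction round.
LER = 2.748e-2

for rounds in (1, 5, 10, 25, 50):
    survival = (1 - LER) ** rounds
    print(f"after {rounds:3d} rounds: P(no logical error) ~ {survival:.3f}")
# After ~25 rounds the logical qubit is as likely wrong as right (~0.50), so
# even one "error-corrected" logical bit cannot be kept reliable for long.
```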
This 2nd-last-paragraph says
"the system remains too slow to correct errors" ← This quantum computer's error correction is impossible after all.
↑ Governments and corporations should stop wasting money on the hopeless, useless, error-prone quantum computers (= only hype remains), and should instead allocate the money to developing practical multi-probe atomic force microscopes that manipulate single atoms and cure diseases.
2024/11/24 updated. Feel free to link to this site.