Quantum computer is hopeless, useless forever; only hype dominates the news.


Quantum computer with a small number of error-prone qubits is still Not a computer. No supremacy.

Quantum computer is already dead, impractical forever. Only hypes remain.

(Fig.Q) Quantum computers that can Not give right answers due to their small numbers of qubits and large numbers of errors are far inferior to ordinary practical classical computers.

Quantum computer is impossible to realize forever.  Only overhyped news is rampant.

Contrary to overhyped news, quantum computers are useless forever.

Quantum computer is still Not a computer or calculator at all.

To hide this inconvenient fact of the already-deadend quantum computer, an incredible amount of overhyped fake news is spread every day.

Each quantum bit or qubit is said to take 0 and 1 states simultaneously using quantum superposition or parallel worlds, for which there is No evidence.

The ordinary (classical) computer consists of many bits (= transistors ), and each bit can take 0 or 1 state.

Quantum computer uses two energy levels of each atom, ion or artificial atom (= superconducting circuit ) as a quantum bit or qubit (= each atomic qubit's ground state = 0, and excited state = 1 ).

This qubit is said to take 0 and 1 states simultaneously using quantum superposition, a dead-and-alive cat or (fictional) parallel worlds.

But physicists can observe only one single state 0 or 1 in one single world in quantum computer bits or qubits ( this 4~5th-paragraphs ).

So there is No evidence that a qubit can take 0 and 1 states simultaneously using the magical quantum superposition or parallel universes.

The present quantum computer has only a small number of qubits (= less than 1000 qubits ), which is far from the millions of qubits required for a practical computer.

The quantum computer is said to use a quantum bit or qubit, which typically consists of an atom or ion whose two energy levels (= ground state = 0,  excited state = 1 ) are used as each bit's state.

The ordinary practical (classical) computers such as laptops have billions of bits (= each bit can take 0 or 1 ) or transistors.

But the present quantum computers have only very small numbers of bits or qubits.  ← Most quantum computers have only 1 ~ 50 qubits, which are completely useless, Not a computer at all ( this 1st-paragraph,  this 6th-paragraph ).

The practical quantum computer is said to need more than millions of qubits ( this summary,  this 2nd-paragraph,  this 4th-paragraph,  this 11th-paragraph ).

Even the current largest quantum computer of IBM has only 1000 error-prone qubits, which are far from millions of qubits required for practical computers.

All the present quantum computers are impractical, too error-prone to give right answers.  So quantum supremacy is fake.

The biggest problem that makes it impossible to put quantum computers to practical use is a high error rate or noise.

The ordinary classical computer or a laptop has an extremely low error rate of only 10^-17 ~ 10^-27, which means only one error happens in 10^17 bit operations, and these errors can be easily corrected.

So the current (classical) computers used in our daily life have effectively No errors.

On the other hand, all the current quantum computers have too high error rates of 10^-3 (= 0.1% ) ~ 10^-2 (= 1% ) in each qubit operation, which means one error occurs in every 100~1000 operations ( this 2nd-paragraph ), which is far more error-prone than the practical classical computers ( this p.1-left-last-paragraph,  this 2nd-paragraph ).

Furthermore, the error correction of quantum computers is impossible (forever).

So all the current quantum computers are unable to give right answers when they try to calculate some values, which means quantum supremacy or advantage is fake.

The practical quantum computer is said to need a much lower error rate of only 10^-15 ( this 5th-paragraph,  this 3rd-paragraph,  this-abstract ), which is impossible to achieve.
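The practical gap between these error rates can be illustrated with simple probability arithmetic (a minimal sketch, assuming independent errors at the per-operation rates quoted above):

```python
# Probability that a computation finishes with zero errors,
# assuming independent errors at a fixed rate per operation.
def zero_error_probability(error_rate, n_operations):
    return (1.0 - error_rate) ** n_operations

# Qubit at a 0.1% (= 1e-3) error rate per operation, 10,000 operations:
p_quantum = zero_error_probability(1e-3, 10_000)    # ~0.000045

# Classical bit at a 1e-17 error rate per operation, same workload:
p_classical = zero_error_probability(1e-17, 10_000) # ~1.0

print(f"quantum  : {p_quantum:.6f}")
print(f"classical: {p_classical:.6f}")
```

At a 10^-3 per-operation error rate, even a modest 10,000-operation computation almost certainly contains at least one error, while a classical computer at 10^-17 stays essentially error-free.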

 

D-Wave annealing machines allegedly used for logistics or optimization problems are fake quantum computers with No quantum speed-up or utility.

Fake quantum advantage in quantum annealing or optimization problem (= finding the lowest energy ) is caused by deliberately choosing bad classical methods, so No quantum speed-up.

(Fig.D)  D-Wave quantum annealing machines are slower and worse than classical computers with good algorithm (= such as classical Selby ) in optimization problems, so No quantum computer's advantage.  Only quantum hypes remain.

D-Wave annealing machine for optimization problem is Not a quantum computer, and Not faster.

Contrary to many hypes, D-Wave annealing machines for optimization problems (= finding the lowest energy as the final solution,   this p.4 ) are Not real quantum computers.

So D-Wave and other quantum simulators for optimization problems are useless, unable to show quantum speed-up or advantage ( this 8th-paragraph,  this p.2-right-1st-paragraph ).

See also this p.1-last,  this p.1-left-introduction-1st-paragraph,  this 6th-paragraph.

This p.2-2nd-paragraph says
"While the question of whether QA (= quantum annealing ) can provide a quantum speed-up over classical approaches is still subject of a scientific debate.. to the best of our knowledge, there is No industrial application where QA unquestionably outperforms classical heuristic algorithms."

D-Wave annealing machines use classical circuits controlled by magnetic field.  ← No quantum mechanics.

D-Wave annealing machines use (classical) superconducting circuits called flux qubit whose electric current is controlled by external magnetic field ( this middle-flux qubit ).

In this flux qubit (= quantum bit ), direction of electric current ( clockwise = 0,  anti-clockwise = 1 ) represents each qubit's 0 or 1 states.

↑ No quantum mechanics is used.
Just the quantized de Broglie wave influences flux qubits.

D-Wave annealing machines showed No quantum speed-up or advantage in optimization problem such as travelling salesman, logistics, traffic, scheduling.

It is said that D-Wave annealing machines are good at optimization problems finding the lowest energy state or the shortest route as solutions.

This optimization problem includes finding the shortest route or way in traveling salesman problem, logistics, traffic flow and scheduling.

But in fact, D-Wave annealing machines were useless, showing No quantum advantage or speed-up in this optimization, traveling salesman problem, or logistics ( this 2nd-last-paragraph,  this p.1-abstract,  this abstract-lower,  this p.1-intro-2nd-paragraph ).

This 5th-last-paragraph says "It is still not clear how to prove quantum advantage in this kind of quantum computation. In fact, it hasn't been proved yet that quantum annealing gives an advantage over classical optimization algorithms."

Fake quantum speed-up in D-Wave comes from deliberately choosing bad classical methods (= simulated annealing )

There are various good classical methods (= ex. Selby ) which are faster and more accurate than D-Wave quantum annealing.  So there is No quantum advantage.

Fake quantum speed-up or advantage of D-Wave annealing machines is based on unfair comparison by deliberately choosing bad classical methods such as simulated annealing and Monte Carlo methods.

There are better classical methods such as Selby algorithm that can solve optimization problems much faster and more accurately than D-Wave quantum annealing machines ( this-3rd-paragraph,  this-2nd-paragraph,  this p.15 ).

This article's 15th and 19th paragraphs say

"So besides simulated annealing, there are two more classical algorithms that are actors in this story. One of them is quantum Monte Carlo, which is actually a classical optimization method,"

"What the Google paper finds is that Selby's algorithm, which runs on a classical computer, totally outperforms the D-Wave machine on all the instances they tested."

This p.6-left-C. says
"Based on the results presented here, one cannot claim a quantum speedup for D-Wave 2X,.. This is because a variety of heuristic classical algorithms can solve most instances of Chimera structured problems (= D-Wave quantum annealing ) much faster than SA, QMC, and the D-Wave 2X.. For instance, the Hamze-de Freitas-Selby algorithm "

Classical Selby algorithm can give right answers more quickly than D-Wave quantum annealing.

This research paper says ↓

p.11-1st-paragraph says
"D-Wave fails to compute a ground state for all 20 instances (= D-Wave could Not obtain the lowest-energy ground state solution )"

p.15-7.9-3rd-paragraph says "Selby heuristic (= classical ) did not find the best known solution within the standard allowed time of 30 seconds, but when we allowed 100 seconds instead, the heuristic found the best known solution for all instances but one"

p.15-lower says "#Selby-best-100 number of instances for which the (classical) Selby heuristic found the best known solution within 100 seconds."
"#DW-best number of instances for which D-Wave found the best known solution"

p.16-Table 2 shows the classical Selby (=#Selby-best-100) obtained almost all the best solutions, while D-Wave (= #DW-best ) was unable to find the solutions in many instances.

↑ As a result, D-Wave quantum annealing is unable to obtain the right solution (= lowest-energy ground state ) in many cases, which is far inferior to the good classical Selby algorithm.

D-Wave and Harvard's optimization problems deliberately chose bad classical method to claim fake quantum advantage.

Also in the recent optimization problems of (useless) Ising model, D-Wave and Harvard researchers deliberately picked up bad classical methods such as simulated annealing and Monte Carlo instead of the better classical Selby method to (falsely) claim quantum annealing (= QA ) advantage ( this p.5-left-Benchmarking against simulated annealing,  this p.1-right-1st-paragraph ).

So unlike the media-hype, there has been No quantum speed-up or advantage.

D-Wave quantum annealing (= QA, DW ) is slower and worse than even the bad classical method of simulated annealing (= SA ) in many cases.

This-abstract-lower,  this p.11-2~3rd-paragraphs

 

Quantum computer's supremacy is fake, useless.

Quantum computers' supremacy and advantage are an illusion, just outputting random meaningless numbers, far inferior to classical computers.

(Fig.S)  The present error-prone quantum computers are unable to give right answers, so physicists tried to hide their errors into random meaningless numbers as fake supremacy.

Quantum supremacy or advantage turned out to be fake, ordinary classical computers are far superior and faster.

In fact, all the claims of quantum computer's supremacy and advantage are fake, useless, Not faster at all ( this-3rd-last-paragraph ).

After Google team announced the so-called quantum supremacy or speed-up using a 53-qubit quantum computer, various teams showed that ordinary classical computers were actually faster than the quantum computer.

The Chinese team's and Canadian Xanadu's photon quantum computers' advantage was also denied, and classical computers were proved to be faster than quantum computers ( this 2nd-paragraph ).

Even the current largest 127-qubit IBM quantum computer was outperformed by ordinary classical computers, so it was impossible that Google or other teams' quantum computers with far fewer qubits could outperform classical computers.

Now it is said that over a million qubits will be necessary to match the current practical classical computers.
The present quantum computers with far fewer qubits are completely useless and hopeless ( this 4th-paragraph ).

Google quantum computer uses only 53 superconducting qubits consisting of classical IC circuits, No quantum mechanics was involved.

Google used a quantum computer named Sycamore consisting of 53 superconducting qubits called transmons or artificial atoms.  In each qubit, two different electric-current states with different energies in a classical (very cold) IC circuit including a Josephson junction represent the bit's (= qubit's ) 0 or 1 state, which can be manipulated by classical microwaves ( this p.2 ).

Cooper pairs, where two negative electrons allegedly attract each other via the fictional phonon quasiparticle inside a superconductor (= real electrons cannot attract each other ! ), can be explained by classical electrons ( this p.4 ) attracting each other through positive nuclei (= fictional quantum mechanical quasiparticles are unnecessary ).

Quantum computer's (fake) supremacy and advantage experiments can only output random meaningless numbers into which they tried to hide errors.

The problem is that all the present quantum computers are too error-prone to give right answers.

Correcting quantum computer's errors is still impossible ( this 3rd-paragraph ).

This is why all the experiments claiming the (fake) quantum computer supremacy or advantage could just output random meaningless numbers into which physicists tried to hide inconvenient errors (= they tried to take advantage of random meaningless numbers remaining random even if they contain a lot of errors ).

This 8th-paragraph says
"Nothing useful in any practical sense — in fact, it is randomly generated, a random quantum circuit."

Quantum supremacy or advantage claim is fake, based on the unfounded assumption that 53 qubits could use 2^53 unseen parallel universes to get random numbers.

All the dubious quantum supremacy or advantage claims are based on unfounded assumption that each bit or qubit (= just classical circuit ) can take 0 and 1 states simultaneously using imaginary quantum superposition, a dead-alive cat living in fictional parallel worlds.

So Google Sycamore's 53 qubits were said to take as many as 2^53 different (unseen) quantum superposition states or parallel universes interfering with each other to obtain some random meaningless numbers ( this-5th-paragraph,  this-4th-paragraph,  this 8th-paragraph ).

But it is impossible to observe such unrealistic quantum superposition states or parallel universes.
All we can observe is only one single qubit's state, 0 or 1 (= we can observe only one cat's state, dead or alive, due to "superposition collapse by measurement",  this 4~5th-paragraphs,  this 5-6th-paragraphs ) in only one single world ( this 26-29th-paragraphs ).

What they call quantum superposition qubit states of 0 and 1 is just transient intermediate state between 0 and 1 which gives only one of 0 or 1 states when measured.

Google quantum computer just illuminated each (classical) qubit by classical microwave, and changed the qubit's state where No fantasied quantum superposition or parallel universes happened.

Ordinary classical computers can take only one bit state, 0 or 1, in one realistic world at the same time, so they (wrongly) claim that quantum computers utilizing the (unseen, fictional) 2^53-parallel-universe superposition cannot be imitated by classical computers in the (fake) quantum supremacy experiments ( this 6~8th-paragraph,  this middle ).

There is No evidence that the unobservable quantum superposition or parallel universes actually happened in the dubious, error-prone (= noisy ) quantum supremacy experiments.

There is No evidence that such unrealistic quantum superposition (= a dead-and-alive cat ) states or parallel universes actually happened in the process of obtaining random meaningless numbers (= their quantum computer could obtain only one set of random meaningless numbers at once, with No parallel universes ) in the (dubious) quantum supremacy experiments.

Ordinary classical computer can obtain random numbers much more quickly than the current bulky quantum computer, so there is No quantum supremacy or speed-up at all from the beginning.

Furthermore, the present quantum computers are too error-prone to give right answers ( this-last-paragraph ), and error correction is impossible.

Error rate (= or noise rate ) of Google's Sycamore's quantum computer was said to be 99.8% (= only 0.2% fidelity = 1 - error rate,  this p.16-upper,  this p.3-right,  this p.3-right-3rd-paragraph ).

This p.3 (or p.2)-1st-paragraph says
"For example, Google’s recent quantum supremacy experiment estimated that their fidelity was merely ∼ 0.2% (i.e, the experiment was ~ 99.8% noise = 99.8% error rate ) and that their fidelity will further decrease as they scale their system"

It means random meaningless numbers obtained by Google or Quantinuum quantum computers (= still Not computers ) were just results of errors, Not of the fictional quantum superposition or parallel worlds.

Chinese and Canadian Xanadu photon quantum computer's advantage is also fake and useless.

Also in Chinese and Canadian Xanadu photon quantum computer's (fake) advantage experiments, they just randomly detected weak lights or about 100 photons with No meaningful computation, which is completely useless ( this-last-paragraph,  this-8th-paragraph-caveats,  this-last-paragraph,  this-last-paragraph ).

This fake quantum advantage claim is based on the false assumption that (non-existent, imaginary) classical photon balls are indivisible, unable to split into two paths at each beam splitter, while quantum photons could split and interfere with each other (= because a quantum photon is just a weak, divisible classical light wave ).

↑ They just detected about 100 random photons at photodetectors without any meaningful computation (= called random boson sampling,  this p.1-right ), and claimed these quantum photons might go through many (unseen, fictional) quantum superposition (= split into many paths ) or parallel universes until measured at photodetectors ( this-scenario,  this 2~4th-paragraphs,  this 3rd-paragraph,  this 2nd-paragraph,  this 3~5th-paragraphs ).

While the fictitious classical photon balls couldn't split nor utilize quantum parallel universes, hence, they wrongly claimed this photon's splitting and interference might be quantum advantage ( this 4th-paragraph,  this 4-5th-paragraphs ).

First of all, there are No such things as indivisible classical photon balls, so if we treat the (fictional) quantum photons as ordinary divisible classical light waves, we can naturally explain this random light interference and detection by the classical wave, and the quantum advantage disappears and becomes meaningless.

Photon quantum computers (= still Not computers at all ) are known to be the most error-prone due to massive photon loss ( this p.1-right-1st-paragraph ), so it is impossible for such error-prone photon quantum computers to give right answers or outperform the ordinary practical classical computers.

Classical computers could outperform and beat the (error-prone) quantum computers even if they were unfairly forced to imitate non-existent quantum superposition or parallel universes.

There is No evidence that (fictional, unseen) quantum superposition or parallel universes actually happened, so there were No quantum supremacy nor advantage from the beginning.

Even when classical computers were unfairly forced to imitate such imaginary non-existent quantum simultaneous superposition states or parallel universes, the classical computers could outperform or beat the quantum computers, as I said.

So the idea of quantum supremacy, advantage or speed-up was officially dead ( Grover search algorithm is also meaningless and not faster ).

 

Quantum computer is too error-prone to give right answers, and unable to correct their errors.

Quantum computer's error correction is disastrous, hopeless, unable to correct their errors.  = impractical forever.

(Fig.S)  Quantum error correction is disastrous, impractical forever, increasing errors instead of decreasing errors !

Practical quantum computer will need the error rate of less than 10^-15, which is impossible to realize.

All the present quantum computers are too error-prone to give right answers.

And unlike the ordinary classical computers such as laptops, correcting these errors in quantum computers is impossible ( this challenge 2,  this 4th~ paragraphs ).

Even the current best quantum computer's error rate per two-qubit gate operation is about 1 ~ 0.1% (= 10^-2 ~ 10^-3 ), which is far worse than the ordinary practical classical computer's error rate of less than 10^-17 ( this p.1-left,  this 2nd-paragraph ).

Practical quantum computer is said to need a far lower error rate of less than 10^-15 ( this 5th-paragraph,  this-3rd-paragraph,  this 2nd-paragraph,  this abstract ), and more than millions of qubits for correcting errors ( this 19~21st-paragraphs,  this 2~6th-paragraphs ), which is impossible to realize forever.

Quantum computer is too noisy and too error-prone to be practical.

This 1st-paragraph says
"This noise leads to computational errors, and today's quantum computers are too error-prone to compute the answers to problems that are of practical utility"

This 9th paragraph says
"They're highly susceptible to "noise" (interference), which leads to computational errors. One of the most common breeds of quantum computers is described as being noisy intermediate-scale devices. They're prone to error,"

Fidelity is equal to 1 - error rate.

Fidelity (= F ) is the opposite of error rate.
Fidelity F = 1 - error rate ( this p.3-left-2nd-paragraph,  this p.59 ).

So when the fidelity is 98%, the error rate is 2% ( this p.1-left-1st-paragraph,  this 18th-paragraph ).

This 3~4th-paragraphs say
"Even with a so-called state-of-the-art fidelity of 99.9 percent, an error still occurs in one out of every 1000 operations (= error rate is 0.1% ). In the realm of quantum computing, where complex algorithms require many thousands or even millions of qubit operations, such error rates are prohibitively high. Conducting long calculations becomes practically impossible without accumulating significant errors that render the results unreliable."

"By contrast, classical computers exhibit minuscule error rates. Thanks to highly reliable semiconductor technology and sophisticated error correction techniques, classical computers achieve error rates as low as one error per quintillion operations (that's one in 10^18 )"
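The 99.9%-fidelity figure in the quote compounds exactly as described (a minimal sketch using Fidelity = 1 - error rate, assuming independent errors per operation):

```python
def fidelity_after(per_op_fidelity, n_ops):
    """Overall fidelity after n independent operations
    (= product of the per-operation fidelities)."""
    return per_op_fidelity ** n_ops

# A "state-of-the-art" 99.9% per-operation fidelity (= 0.1% error rate)
# compounded over 1000 qubit operations:
f = fidelity_after(0.999, 1000)
print(f"{f:.3f}")  # ≈ 0.368, i.e. a ~63% chance of at least one error
```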

Google quantum computer supremacy was unreal due to too high error rate of 99.8%

Each two-qubit gate operation error rate of the Google superconducting quantum computer is about 0.5 ~ 0.6 % (= 5.0 × 10^-3 ~ 6.0 × 10^-3,  this p.5-left-3rd-paragraph,  this p.3-left-5th-paragraph ), an error rate still too high to calculate right values.

In the Google's (fake) quantum supremacy experiment (= their supremacy has been denied now ), the total fidelity (= FXEB ) of random qubit operations ( after 53 qubits × 20 random-qubit operation cycles ) was only 0.2 % ( this p.3-right-1st-paragraph ).

↑ It means the final (accumulated) error (= noise ) rate of the Google supremacy experiment was 99.8% (= 100 - 0.2 %,  this p.3 (or p.2)-1st-paragraph,  this p.16-upper ), which is just an erroneous, wrong answer.

↑ As a result, Google quantum supremacy random numbers were just results of errors (= 99.8% error rate ), Not of quantum mechanical meaningful calculations (= though they just did random meaningless operations ), so No evidence of quantum supremacy.
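A rough product-of-fidelities model reproduces the order of magnitude of this 0.2% total fidelity; the ~1000-gate count used here is an illustrative assumption, not Google's exact circuit:

```python
# Rough model: total circuit fidelity ≈ (1 - per-gate error rate)^(number of gates).
def total_fidelity(gate_error, n_gates):
    return (1.0 - gate_error) ** n_gates

# Illustrative assumption: ~1000 gate operations at the quoted ~0.6% error rate.
f = total_fidelity(6.0e-3, 1000)
print(f"{f:.4f}")  # on the order of 0.002, i.e. ~0.2% fidelity = ~99.8% noise
```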

Quantinuum 56-ion-qubit random sampling was also error-prone with more than 80% error rate.

The recent Quantinuum's 56-ion-qubit quantum computer (= still Not a computer ) was also too error-prone to give right answers (= this is why they could deal with only random meaningless numbers ).

This research paper ↓

p.3-left-3rd-paragraph says "the 2Q (= 2-qubit ) gate fidelities on H2 (= their ion-qubit quantum computer ) up to the 99.9% level (= each gate error rate was 0.1% ) achieved on H1 (= this error rate per 2-qubit gate operation is still too high )"

p.13-Fig.9-(a) shows the total fidelity of Quantinuum 56-ion-qubits (= N = 56 ) after 20 qubit gate operations (= depth = 20 ) was only less than 0.2 (< 20% ), which means the total (accumulated) error rate was more than 80%.

p.13-right-2nd-paragraph says "none of these results (hardness in the absence of noise or easiness at fixed noise rate) is conclusive (= No evidence of quantum supremacy )"

↑ Even the latest quantum computer with the best (= lowest ) error rate could Not give right answers (= the final error rate was more than 80% ), which is why they focus only on random meaningless numbers into which errors could be hidden.

IBM, Harvard quantum computers are also too error-prone to give right answers.

IBM 127-qubit quantum computer also had too high an error rate (= error rate per 2-qubit gate operation was 1.17 × 10^-2,  this p.4-Fig.S1 ), which could Not give right answers.

IBM started to rely on illegitimate error-mitigation by classical computer without quantum error correction ( this 2nd-last-paragraph,  this 2nd-last~3rd-last paragraphs ).

Harvard-QuEra latest 60-atomic qubits also had too high an error rate (= error rate per 2-qubit gate operation was 0.5% = 5.0 × 10^-3,  this abstract ), an error rate that cannot give right answers, so they started to rely on ad-hoc post-selection without error correction.

Topologically-protected robust quantum computer based on fictional quasiparticle is unreal, lacking evidence of "robust".

There is No evidence that the (illusory) topological quantum computer is robust against errors.

Because they have Not realized even one single topological qubit allegedly based on (fictional) Majorana or anyon quasiparticles, let alone the hyped robust quantum computer.

One logical qubit used for calculation must consist of multiple physical qubits for correcting errors.

(Fig.Q)  One logical qubit for calculation consists of multiple physical qubits (= not used for calculation directly ) for detecting errors.

One logical qubit consists of multiple physical qubits for correcting errors.

To correct errors, each (logical) qubit used for calculation must consist of multiple (physical) qubits.

When some physical qubits inside one logical qubit show errors, they correct these errors from the remaining intact physical qubits like a majority vote ( this 8~21st paragraphs ).

So physical qubits are used for error detection instead of being used directly for calculation.

Only logical qubits (= each logical qubit consists of multiple physical qubits ) can be used for actual calculation (= one logical qubit can express only 0 or 1 bit state, and just one physical qubit is not allowed to express 0 or 1 bit state for calculation ).

The current error-prone qubits can Not implement this error correction that is said to need more than millions of qubits ( this 14~16th-paragraphs,  this p.1-right-1st-paragraph ), which is impossible to realize.

This abstract says
"full fault-tolerant quantum computing (FTQC) based on quantum error correction code remains far beyond realization due to its extremely large requirement of high-precision physical qubits"
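The millions-of-qubits requirement follows from simple overhead arithmetic: a distance-d surface code uses d×d data qubits plus d×d - 1 measure qubits per logical qubit. The target logical-qubit count and code distance below are illustrative assumptions, not figures from the cited papers:

```python
def physical_per_logical(d):
    """Physical qubits in one distance-d surface-code logical qubit:
    d*d data qubits plus d*d - 1 measure qubits."""
    return d * d + (d * d - 1)

# Illustrative assumptions: ~1000 logical qubits at code distance 27.
n_logical, d = 1000, 27
total = n_logical * physical_per_logical(d)
print(total)  # 1457000 physical qubits: already in the millions
```

For comparison, the same formula at distance 3 gives 17 physical qubits per logical qubit, matching the 17-qubit distance-three surface code mentioned later in this article.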

One logical qubit consists of multiple data physical qubits and error-detection ancilla qubits in the tricky quantum error correction.

The problem is that a quantum computer is Not allowed to measure each qubit directly, because the measurement is said to destroy the quantum superposition or parallel-world state ( this p.1-left ).

So in a quantum computer, each logical qubit consists of multiple data qubits (= for storing data for calculations ) and multiple ancilla or stabilizer qubits, which are 'entangled' with the data physical qubits and used for detecting errors of data qubits.

And when these error-detection ancilla physical qubits are measured to show some error-flag sign, the corresponding data qubits that are expected to have errors are corrected indirectly ( this middle ).

This "Quantum error correction with surface codes" says
"Bob wants to send Alice a single bit that reads "1" across a noisy communication channel. Recognizing that the message is lost if the bit flips to "0", Bob instead sends three bits: “111 (= one logical qubit consisting of 3 physical qubits )”. If one erroneously flips, Alice could take a majority vote = error can be corrected like 101 → 111 (= a simple error-correcting code )"

"we arrange two types of qubits on a checkerboard. Data qubits (= used for calculation ) on the vertices make up the logical qubit, while measure qubits (= error detection qubits ) at the center of each square are used for so-called stabilizer (= error detection ) measurements. These measurements tell us whether the qubits are all the same, as desired, or different, signaling that an error occurred"
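The three-bit repetition code in the quote can be sketched directly (a classical analogy only; real surface-code decoding is far more involved):

```python
from collections import Counter

def encode(bit):
    """Repetition code: send one logical bit as three physical bits."""
    return [bit] * 3

def decode(bits):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return Counter(bits).most_common(1)[0][0]

# Bob sends logical "1" as 111; the noisy channel flips one bit to 101.
received = [1, 0, 1]
print(decode(received))  # 1 : the single bit-flip is corrected
```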

Quantum computer's error correction is useless, just increasing error rates instead of decreasing errors.

The serious problem is that the current quantum computers are too error-prone to correct errors.

So when physicists try to detect and correct errors, ironically, the error rates increase or worsen instead of decreasing at each error-correction operation (= error-worsening operation ).

In Google's latest error correction experiment (= actually, they did Not correct errors, this 3rd-paragraph ), one logical qubit consisted of 49 physical qubits, whose error rate increased by 3.0% each time error correction operation (= cycle ) was conducted ( this 3rd-last-paragraph,  this 3rd-last-paragraph ).

This 9~10th paragraphs say
"surface code uses 25 data qubits, plus 24 measure qubits to make a total of 49 in one logical qubit.. ( 2.914 percent logical errors per cycle )"

This Google research paper p.4-Fig.3-b shows logical (qubit) fidelity decreased to only 0.2 (= which means error rate increased to 80% ) after 25 error correction cycles, instead of decreasing errors.

↑ This is Not an error correction but an error-worsening operation.

Error rate increases by more than 2.70% per error correction operation (= which is useless ) instead of decreasing.

Also in Honeywell's 10-ion-qubit experiment, the total error rate increased by 2.70% per quantum error correction operation (= QEC,  this p.4-Fig.3, p.4-right-1st-paragraph ) instead of decreasing.

The recent Chinese research also showed the impractically high error rate of 13% (= fidelity was 0.877 ) for a logical qubit consisting of 17 superconducting physical qubits ( this 4th-last-paragraph ) in preparing a slightly complicated qubit state called the magic (= non-Clifford ) state, where probabilities of a qubit's 0 and 1 are a little complicated ( this p.8-10 ).

This research paper ↓

p.3-left-Experimental implementation says "using 17 out of the 66 qubits on the Zuchongzhi 2.1 superconducting quantum system. This 17-qubit distance-three surface code consists of 9 data qubits, 4 X-type ancilla qubits and 4 Z-type ancilla qubits (= each logical qubit consists of 9 data physical qubits and 8 ancilla qubits )"

p.4-right-(5)-2nd-paragraph says the logical (qubit) fidelity of the magic state was 0.877 (= error rate was 13%, which was too high to be practical ).

p.4-right-3rd-paragraph says "After the error correction procedure, the fidelity of the logical states at each point is improved and the logical error rates per round are reduced to 24.77% and 25.65%, respectively (= their error rate increased from the initial 13% ! )"

↑ It means the total error rate increased by 24~25% per error correction round ( p.5-Fig.4 shows fidelity decreased per round )

As a result, the current quantum computer's error correction just increased the total error rate instead of decreasing errors due to too error-prone qubits, which is completely useless.

Ad-hoc Post-selection = physicists gave up correcting errors in the current extremely error-prone quantum computers.

Physicists gave up error correction, and instead, try to illegitimately post-select only qubits that can manage to avoid errors (= discarding all the remaining qubits ).

The current quantum computers are so error-prone that their error correction is impossible.

So physicists started to rely on illegitimate post-selection methods without error correction ( this 6th-paragraph,  this 4th-last-paragraph ).

In these ad-hoc post-selection methods, they unreasonably discard all qubits that show errors, and post-select only intact qubits without performing any error correction operation ( this p.15 ).

This p.1-left-3rd-paragraph says
"which means that detected errors cannot be uniquely identified and corrected. We must therefore rely on postselection to find and discard cases where an error occurred."

This p.7-left-2nd-paragraph says
" this experiment uses post-selection rather than correction"

This post-selection method is completely impractical.

Because in large calculations, almost all qubits in the current error-prone quantum computers show errors and must be discarded.

Only a tiny percent of qubits that can manage to avoid errors remain.  ← It is impossible to calculate large numbers just by these small numbers of post-selected remaining qubits.
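The yield argument above is simple exponential arithmetic; a minimal sketch (hypothetical per-qubit error rate) assuming each qubit independently avoids error:

```python
# Survival probability of a post-selected run: every qubit must avoid an
# error, so the accepted fraction falls off exponentially with qubit count.
def accepted_fraction(per_qubit_error, n_qubits):
    return (1.0 - per_qubit_error) ** n_qubits

for n in (20, 100, 1000):
    print(n, accepted_fraction(0.01, n))
# At a 1% per-qubit error rate: 20 qubits keep ~82% of shots,
# 100 qubits ~37%, 1000 qubits ~0.004% -- at useful sizes essentially
# every run is discarded.
```

So even a seemingly modest per-qubit error rate leaves almost nothing to post-select once the machine approaches practically useful sizes.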

Microsoft, Quantinuum 20-ion qubits, which cannot correct errors, just post-select qubits.

The recent Microsoft-Quantinuum 20-trapped ion quantum computer (= still Not a computer ) just artificially pre-selected ( this 4th-paragraph ) and post-selected only qubits that luckily avoided errors without error correction.

This research paper ↓

p.2-right-3rd-paragraph says "Each logical qubit has seven data qubits and three ancilla qubits, leading to the experiments having a total of 20 physical qubits"

p.2-right-4th-paragraph says "Upon failure to verify the preparation a logical |0⟩ state, the qubits can be conditionally reset and the fault-tolerant preparation can be re-attempted in a repeat-until-success fashion or pre-selected upon verification"

p.3-right-last-paragraph says "Post-selecting on non-trivial syndromes (so No error correction is performed, only error detection was done ), the total acceptance rate goes from ≈ 75% to ≈ 72% (= other 25% qubits were discarded )"

↑ Due to the inability of quantum computers to correct errors, physicists started to rely on artificially discarding erroneous qubits and post-selecting the remaining intact qubits without error correction, which is useless.

Harvard-QuEra (deceptive) fault-tolerant quantum computer just post-selects qubits instead of error correction

See this.

Caltech's "erasure" errors also just discarded qubits that showed errors instead of correcting errors, which is impractical.

The recent Caltech's "erasure" errors published in Nature is also misleading, because they did Not correct the detected errors at all.

They used only less than 20 atomic qubits (= that is far from millions of qubits required for practical quantum computer ) where two atomic energy levels (= pseudo-ground and excited states ) were used as each qubit's 0 (= g ) and 1 (= r ) states ( this p.1-Fig.1 ).

And they just discarded (= or erased, excised,  this p.2-left-last-paragraph, p.9-statistics reduction due to erasure excision ) all qubits that showed errors, instead of correcting errors.

This experiment just tried to prepare a two-qubit Bell state (= both atomic qubits in the ground state or both in the excited state = 00 or 11 ) without any computation.

Even after artificially discarding these qubits showing errors (= instead of correcting errors ), their final error rate is as bad as 0.4% (= fidelity is 0.996,  this-4th-last-paragraph,  this p.2-Fig.1-right ), which is far from the practically-required error rate of 10⁻¹⁵.

↑ "SPAM corrected" also just discarded error-qubits and post-selected only the other intact qubits without correcting errors ( this p.7-Fig.5 ).

The error-correction of such useless quantum computers is still just pie-in-the-sky empty theory with No experimental realization ( this 5th-last-paragraph,  this 4th-last-paragraph,  this 4th-last-paragraph,  this p.32-conclusion,  this 7th-paragraph ).

 

The current largest quantum computer of IBM is error-prone, useless, far inferior to ordinary classical computer

IBM's current largest quantum computers are easily outperformed by ordinary classical computers.  ← Quantum computers are dead.

(Fig.I)  IBM quantum computer is unable to get right answers because of its high error rates, and ordinary classical computers are far faster and more accurate than quantum computers.

Quantum computers are useless, error-prone, unable to give right answer, so quantum advantage is just joke.

Contrary to a lot of hypes such as quantum supremacy and utility, all the current quantum computers are so error-prone that they always give wrong answers.

So practical quantum computer's calculation is impossible, let alone quantum advantage.

The current largest quantum computers of IBM are also too error-prone to give right answers.

To calculate accurately and get right answers, the error rate must be reduced to less than 10⁻¹⁵ ( this 3rd-paragraph,  this abstract ), but IBM quantum computer's two-qubit error rate is much worse and higher = 10⁻² ( this p.4 Fig.S1-two-qubit-error per gate = EPG of IBM is 1.17 × 10⁻², far higher than the 10⁻¹⁵ required for accurate calculation ).

Error correction of quantum computers is unrealistic, impossible forever ( this p.2-introduction-2nd-paragraph ).

IBM gave up correcting errors of its error-prone quantum computers, instead, they tried to rely on illegitimate "error mitigation" model.

IBM gave up quantum error correction, and tried to rely on the ad-hoc illegitimate error-mitigation model in the recent paper in Nature using 127-qubit IBM Eagle quantum computer ( this p.7-left-noise model ).

In this illegitimate error-mitigation approach, IBM paradoxically increased errors, and tried to extrapolate or guess the right answers from the intentionally-increased errors, using an artificial noise model.

The 10th paragraph of this or this says

"Paradoxically, IBM researchers controllably increased the noise in their quantum circuit to get even noisier, less accurate answers and then extrapolated backward to estimate the answer the computer would have gotten if there were no noise."

↑ It is impossible to get right answers from the wrong erroneous numbers without error correction, unless they knew the right answers from the beginning.

↑ This error mitigation model had to be estimated by ordinary classical computer ( this 7th-paragraph,  this p.1-right-2nd-paragraph ), because all they could do was make the original error-prone quantum computer noisier and less accurate (= only classical computers were reliable errorless machines ).
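The extrapolation procedure quoted above, commonly called zero-noise extrapolation, can be sketched in a few lines (toy numbers and a toy decay curve, Not IBM's actual model): measure at deliberately amplified noise levels, fit a curve, and extrapolate back to zero noise. The catch is visible in the sketch itself: the "answer" is only as good as the assumed noise model.

```python
import numpy as np

# Zero-noise extrapolation sketch: the device is run at noise scale
# factors lam = 1, 2, 3 (noise deliberately amplified), and a polynomial
# fit to the noisy results is extrapolated back to lam = 0 to *guess*
# the noiseless value.
true_value = 1.0                            # hypothetical ideal answer
lam = np.array([1.0, 2.0, 3.0])             # noise amplification factors
measured = true_value * np.exp(-0.3 * lam)  # toy exponential decay of signal

coeffs = np.polyfit(lam, measured, deg=2)   # quadratic fit to noisy data
estimate = np.polyval(coeffs, 0.0)          # extrapolate to zero noise

# The estimate lands near 1.0 only because the toy decay happened to be
# well approximated by the fitted polynomial -- a wrong noise model gives
# a confidently wrong "mitigated" answer.
print(estimate)
```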

The 2nd-last paragraph of this and this says
"Some researchers are less optimistic about the potential of noise mitigation (= illegitimate ), and expect that only quantum error correction (= still impossible ) will enable calculations that would be impossible on even the largest classical supercomputers"

This paper p.3-Fig.2 shows IBM 127-qubit quantum computer always gave wrong answers.
In this figure, the unmitigated (= green ) values (= raw results ) obtained by the IBM quantum computer were completely different from the ideal gray dotted line or right answers.

So this IBM error mitigation method without performing quantum error correction is irrelevant to quantum computer's advantage or utility.

IBM quantum computer consists of classical IC-like circuits controlled by classical microwave, No quantum mechanical superposition.

IBM quantum computers use superconducting transmon qubits which are just classical IC circuits using Josephson junction ( this p.2 ).

This superconducting qubit is a classical (IC) circuit whose two energy levels, based on different amounts of charge or current, are treated as each qubit's 0 or 1 state, which can be changed or controlled by classical microwave pulses through classical circuits.

This quantum bit or a qubit is said to take 0 and 1 bit states simultaneously using quantum superposition or parallel universes, which has No evidence.

There is No evidence that quantum computer's qubits use (unseen) superposition or parallel worlds for quantum advantage

Dubious quantum computer's utility or advantage is based on the unfounded assumption that each qubit takes 0 and 1 bit states simultaneously, or that IBM's 127 qubits could take 2¹²⁷ superposition states or parallel universes in the process of qubit operation by microwave.

Ordinary classical computer cannot utilize this (fictional) quantum superposition or parallel universes for simultaneous calculations, so they (baselessly) claim quantum computer is faster and superior to classical computer.

But all they can observe is only one 0 or 1 state in each qubit, and only one bitstring as a final answer even in the quantum computer.
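This observation can be stated concretely with a minimal numpy sketch (a toy statevector simulation, not a device run): however large the superposition is on paper, each run returns exactly one classical bitstring sampled by the Born rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# An equal superposition over all 2^n basis states of n qubits...
n = 4
amplitudes = np.full(2**n, 1 / np.sqrt(2**n))   # |+>^n state vector

# ...but a measurement returns ONE basis state per shot, sampled with
# probability |amplitude|^2.  No run ever shows "0 and 1 simultaneously".
probs = np.abs(amplitudes) ** 2
shot = int(rng.choice(2**n, p=probs))
print(format(shot, f"0{n}b"))   # a single classical bitstring, such as "0110"
```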

No evidence of quantum superposition or parallel universes, so there is No quantum supremacy, advantage or utility from the beginning.

IBM 127-qubit quantum computer in Nature is said to have used (unseen) 2¹²⁷ parallel universes, but No evidence.

In the recent IBM 127-qubit quantum computer research published in Nature, they conducted multiple one- or two-qubit operations on 127 superconducting qubits by applying microwave pulses based on an impractical qubit-interaction model called the Ising model ( this p.2-left, Fig.1 ).

They claim IBM's 127-qubit quantum computer harnessed (unseen) 2¹²⁷ quantum superposition states or parallel universes to obtain final answers that were said to be hard to obtain by an ordinary classical computer using only one single world.

But as I said, the error-prone IBM quantum computer could Not give right answers, so there was No evidence that they really used (unseen) quantum superposition or parallel universes.

Classical computers, unfairly forced to imitate such (unseen, baseless) 2¹²⁷ superposition states, could give more accurate answers more quickly than IBM's dubious quantum computer.

Despite the unreliable evidence of quantum superposition or parallel universes, a classical computer was unfairly forced to imitate or calculate as many as 2¹²⁷ different superposition states or parallel universes allegedly utilized by the IBM 127-qubit quantum computer in the Ising model ( this p.3-7 ).

This classical computer's method of imitating (unseen) quantum superposition or parallel worlds is called a tensor network.
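The scale of what the classical simulation was asked to reproduce is easy to quantify: a brute-force state vector for 127 qubits holds 2¹²⁷ complex amplitudes, far beyond any physical memory, which is exactly why tensor-network methods compress and approximate it instead. A quick sketch of the arithmetic:

```python
# Memory needed for a full state vector of n qubits at 16 bytes per
# complex128 amplitude -- the brute-force cost tensor networks avoid.
def statevector_bytes(n_qubits):
    return 2**n_qubits * 16

print(statevector_bytes(30) / 2**30)   # 16.0 GiB: roughly a laptop's limit
print(statevector_bytes(127))          # 2**131 bytes (~2.7e39): physically absurd
```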

Even in this unfair condition, the ordinary classical computer (= laptop, this abstract ) could calculate more accurate answers much more quickly than the error-prone IBM quantum computer relying on illegitimate error-mitigation model ( this 6-11th-paragraphs ).

So quantum computers were officially proved to be much slower and inferior to classical computers.

Quantum computer advantage or utility turned out to be just hype and fake.

IBM 1121-qubit quantum computer is error-prone, so useless.

Recently, IBM announced the largest 1121-qubit quantum computer.

But IBM is silent about how error-prone this largest quantum computer is (= because it is their inconvenient truth ).

Actually, the latest IBM research used only 27 qubits of very old machines ( this p.4-left-last-paragraph ).  ← Because all other IBM quantum computers with more than 27 qubits are more useless and error-prone, as shown in the recent protein-prediction hype.

There is No news that IBM improved their error rates, which means their largest quantum computers are still error-prone, useless, unable to give right answers.

Furthermore, even this alleged world's largest quantum computer of IBM is still far from millions of qubits required for practical quantum computers ( this-summary,  this 4th-paragraph ).

Quantum computers are officially dead.

Chinese largest 504-superconducting-qubit quantum computer is also useless, error-prone, far from millions of qubits required for the practical quantum computer.

The recent Chinese largest quantum computer using 504 superconducting qubits is also useless, far from the practical quantum computer that is said to need millions of qubits.

This 4-5th-paragraphs say
"its gate fidelity (= 1 - error rate ) and the depth of its quantum circuit, are expected to reach (= still unclear ) the chip performance levels of main international cloud-enabled quantum computing platforms such as IBM (= these present quantum computers' fidelity and error rates are too bad to give right answers )."

"the main purpose of the chip is to promote the development of large-scale quantum computing measurement and control systems, rather than to aim for higher computing power and quantum supremacy (= it is impossible that the current error-prone impractical quantum computers are superior to the useful classical computers )."

 

Quantum computers are completely useless for calculating molecular energies or discovering drugs.

They force the practical classical computer to perform most of the complicated molecular energy calculations, and try to make the useless quantum computer take all the credit for the classical computer's work as a (deceptive) hybrid computer.

(Fig.D)  Quantum computers with small numbers of qubits and high error rates are Not useful calculators at all.  ← Quantum computer's drug discovery will never happen.

Quantum computers are useless for any atomic or molecular energy calculations.

The current quantum computers are useless, too error-prone to calculate any meaningful values.

This is why physicists had to create the (deceptive) hybrid quantum-classical method, which is substantially a classical computer, while the useless quantum computer can do almost nothing.

Quantum advantage is impossible also in molecular energy calculation.

For quantum computers to be practical or show advantage, more than millions of qubits are said to be necessary ( this 1st-paragraph ).

This is impossible forever.

Most of the current quantum computers have fewer than 50 qubits (= at most a 50-bit bitstring ), which cannot calculate any meaningful values.

This 2nd-paragraph says
"It is said that a quantum computer with a scale of 1 million qubits is needed to realize ultrafast calculations that are impossible with conventional computers, such as quantum chemical calculations."

Useful quantum computer is said to need millions of qubits, which is impossible.

The 1st, 7th, 10th paragraphs of this news say

"But so far none have conclusively achieved quantum advantage"

".. Small is key because estimates indicate that a world-changing quantum computer will need one million to 10 million qubits"

"Companies that are exploring this technology have popped up in the past few years, including Alpine Quantum Technologies (which has a computer with 24 qubits), IonQ (which has 29 qubits) and Quantinuum (which has 32 qubits)"  ← far from millions of qubits required for quantum advantage.

The current impractical quantum computers with only small numbers of qubits (< 50 qubits ) need overhyped fake news.

More than millions of qubits are necessary to achieve true quantum advantage ( this-4th-paragraph ).

So this kind of news falsely claiming quantum advantage just by outputting random useless numbers using only 56 error-prone qubits is overhyped fake news.

In the current energy calculation of a molecule or protein (= this is hype, they actually dealt with only 7 amino acids ), only very small numbers of qubits (= fewer than 20 impractical qubits ) were used in (deceptive) hybrid quantum-classical methods.

Almost all the important molecular energy calculations must be done by ordinary classical computers in these hybrid quantum-classical methods, where only the "quantum computer" part is exaggerated and hyped.

No quantum computer advantage in molecular energy calculation.

Imaginary quantum advantage is based on fictional parallel-world simultaneous calculations (= quantum phase estimation or QPE ), which is impossible forever.

The still-unrealized quantum advantage for finding true (lowest) molecular energy is counting on imaginary quantum superposition or parallel worlds in the method called quantum phase estimation (= QPE ).

In this (impractical) quantum phase estimation, the initially-chosen trial wavefunction (= just vague guess,  this p.14 ) is said to change into multiple different wavefunctions using (fictional) quantum superposition ( this 2nd-paragraph ) or parallel worlds.

And they unrealistically think the molecule's lowest true ground-state energy may be singled out by interference among these multiple imaginary parallel-world wavefunctions, in a method called the quantum Fourier transform (= QFT, this 1~4,  this phase kickback ).

↑ This quantum computer's advantage or speed-up is impossible and unrealized, as shown in Shor-algorithm factoring ( this p.1 introduction ).
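For reference, the ideal QPE readout described above (superposition followed by QFT interference) has a simple closed form; a toy numpy sketch (a noiseless classical simulation with a hypothetical eigenphase, not a device run) shows what the algorithm is supposed to produce, and why each extra bit of precision doubles the number of controlled-U applications.

```python
import numpy as np

def qpe_distribution(phi, n_ancilla):
    """Ideal (noiseless) readout probabilities of textbook phase estimation.

    After the inverse QFT, outcome k has amplitude
        (1/2^n) * sum_j exp(2*pi*i * j * (phi - k/2^n)),
    a geometric sum that peaks at k ~= round(2^n * phi).
    """
    N = 2**n_ancilla
    j = np.arange(N)
    k = np.arange(N)[:, None]
    amps = np.exp(2j * np.pi * j * (phi - k / N)).sum(axis=1) / N
    return np.abs(amps) ** 2

phi = 0.3                         # hypothetical eigenphase to estimate
probs = qpe_distribution(phi, n_ancilla=5)
best = int(np.argmax(probs))
# Peak at k = 10, i.e. 10/32 = 0.3125, the nearest 5-bit approximation of
# 0.3 -- and getting there costs 2^5 - 1 = 31 controlled-U applications,
# which is why deep, error-free circuits are required.
print(best, best / 32)
```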

Quantum computer has to rely on adiabatic optimization like D-Wave annealing to find the lowest ground-state energy, which has No advantage.

In order to find the true lowest ground state energy of a molecule (= in methods such as quantum phase estimation ), quantum mechanics has to rely on adiabatic optimization or annealing like D-Wave ( this p.14-last,  this p.2, 7-lower ) for guessing the good initial state, which was proved to have No quantum advantage nor speed-up.

So whatever quantum methods physicists try to employ, quantum computers can never show quantum speed-up nor advantage over classical computer.

Using only quantum computers to calculate molecular energy (like in QPE) is impossible due to a lot of errors.

This recent paper ↓

p.1-abstract says "VQE has been proposed as an alternative to fully quantum algorithms such as quantum phase estimation (QPE) because fully quantum algorithms require quantum hardware that will not be accessible in the near future"

p.2-2nd-paragraph says "For example, the resources estimated to perform chromium dimer calculations, which is a large enough molecular system to demonstrate the quantum advantage on a quantum computer, with existing algorithms require at least 1 million physical qubits"

This paper also says ↓

p.1-left says "Some of them rely on fault-tolerant quantum hardware computation, such as the Quantum Phase Estimation (QPE) algorithm, which is currently Not feasible"

p.1-right-2nd-paragraph says "A practical quantum advantage, i.e., an advantage of quantum computation over classical computation in an industrially relevant problem has however Not yet been demonstrated"

This 3rd paragraph says "the QPE algorithm requires deep circuits with ancilla qubits, which are difficult to execute in the noisy intermediate-scale quantum (NISQ) era due to limited quantum resources (= the current noisy, error-prone quantum computers cannot carry out this QPE,  this p.2-2nd-paragraph )"

↑ This Chinese research just did classical computer's simulation of only N = 2 ~ 4 qubits instead of using real impractical quantum computers ( this p.5-Fig.3 ).

Quantinuum used only 2 impractical ion qubits for a fake simulation of a H2 molecule.

The 2nd, 7-8th paragraphs of this recent hyped news say

"Quantum computing company Quantinuum said it was able to simulate a hydrogen molecule by using a "fault tolerant algorithm (= Not error-correction )" on the company’s H1 quantum computer."

"In a preprint study, the team claims to have overcome the challenge by using an error-detection code which saves quantum resources by immediately discarding calculations if the code detects qubits that have produced errors (= detect errors → discard and give up this calculation without performing error correction )."

"The team used this code to make three logical qubits (= only 3 qubits or 3 bitstring 001, which bit number is too small to calculate any molecular energy ) which are composed of multiple physical qubits."

Quantum computer with only several qubits (= still Not a computer ) cannot calculate any molecular energies without the help of practical classical computer.

This Quantinuum research paper ↓

p.1-right-(1), 2nd-last, last-paragraphs and footnote says "In the present work, we demonstrate Bayesian QPE on a quantum charge-coupled device trapped-ion quantum computer."

".. where Xi, Yi, Zi are the Pauli-X, Y , Z operators acting on ith qubit (= which means this research used only two impractical ion qubits ), I is the identity operator, and {hj} are real coefficients."

"..The spin Hamiltonian (= fake simplified Hamiltonian energy artificially created for a small number of qubits ) is obtained with the Jordan–Wigner transformation"

"(footnote) The coefficients hj are evaluated on classical computers by performing the restricted Hartree–Fock method (= classical computer was needed to calculate molecular energy after all )"

Quantinuum's only 2 impractical qubits could not correct errors, instead, they just discarded erroneous results.

↑ The same Quantinuum research paper ↓

p.2-left-2nd-paragraph says "The code detects an arbitrary single-qubit error and discards the associated result (= without error correction ) to reduce the error rate at the cost of more circuit executions"

p.2-left-(2) shows only 2-qubit operation

p.5-left-last-paragraph says "We use InQuanto v2.1.1 and its interface to pyscf v2.2.0 to calculate the coefficients of the spin Hamiltonian (1) on classical computers. The classical pre and post-processing for parameter selection (= almost all molecular energy calculation was done by classical computer, Not by the impractical quantum computer with only two ion qubits )"

p.6-left-5th-paragraph and Figure 4 show the rate of discarding erroneous results increased to 0.7 (= 70% of all calculated results were discarded due to errors ) as the number of qubit operations (= depth k ) increased, which is useless.

p.13-4. used only up to 4 qubits also in noiseless simulation by classical computer ( this p.3-1st-paragraph ).
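The depth-dependence reported in Figure 4 follows the same exponential logic as any detect-and-discard scheme. A minimal sketch, assuming a hypothetical constant per-round discard probability (the actual per-round rate in the paper may differ):

```python
# If each round of error *detection* throws away a run with probability d,
# the cumulative discard fraction grows toward 1 exponentially with depth k.
def discarded_fraction(d_per_round, k_rounds):
    return 1.0 - (1.0 - d_per_round) ** k_rounds

# A per-round discard chance of ~11% (hypothetical) already reaches the
# ~70% discard level by depth k = 10.
for k in (1, 5, 10):
    print(k, round(discarded_fraction(0.11, k), 3))
```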

Quantum computers cannot simulate chemical reaction at all.

Only one impractical ion qubit can Not simulate a chemical reaction or photosynthesis at a conical intersection.

Various news sites spread overhyped news that researchers used a quantum computer to observe some chemical reaction called conical intersection related to photosynthesis.

But this is completely false.
Quantum computer is completely useless and unable to simulate any meaningful chemical reactions.

This research paper ( this ↓ ).

p.2-Figure 1-g says "in an ion-trap quantum simulator with a single 171Yb+ ion (= only one single ion qubit 0 or 1, Not a computer or simulator at all )."

p.7-Experimental setup says "The 171Yb+ ion (= only one ion ) is confined in a Paul trap.. We use two laser beams derived from a 355 nm pulsed laser to coherently control the qubit"

↑ This research just trapped only one single (impractical) ion by a controllable external electromagnetic field called a Paul trap, and manipulated the ion by two laser beams.

That's all. No quantum computing nor simulation of chemical reaction was done, much less photosynthesis (= just one single bit or ion can Not calculate nor simulate anything ).

Conical intersection is a very vague (useless) concept of two energy states intersecting ( this p.8 ).

Quantum computer cannot identify single nucleotides.

The 4th, 6-7th paragraphs of this hyped news say

"the researchers used a quantum computer to distinguish adenosine from the other three nucleotide molecules (= false, Not a quantum computer but electrodes measuring electric conductance through nucleotides distinguished or identified the kinds of nucleotides )."

"The researchers used electrodes (= Not a quantum computer ) with a nanoscale gap between them to detect single nucleotides. The output of current versus time for the adenosine monophosphate nucleotide differed from that of the other three nucleotides"

"Variations in (electric) conductance depend on molecular rotation patterns that are unique for each nucleotide"

↑ This research tried to connect this electrode with only three impractical (superconducting) qubits with high error rates.

As shown in this p.15-2nd-paragraph and p.16, the success rate of identifying nucleotides by these 3 impractical qubits (= connected to electrodes ) was only 77.8% (= error rate was 22.2% ), which was worse than the 95% success rate of the quantum simulator (= which is a classical computer ) connected to electrodes.

↑ Quantum computer was useless with No advantage.

Quantum computer is completely useless for health care.

The research shown in the 2nd-paragraph of this hyped news did Not use quantum computers, contrary to hypes.

This research paper ↓

Abstract says "To execute the grid search, our research employed a classical optimiser (= this research used a classical computer, Not a quantum computer )"

Figure 2, Figure 4, and Section 4.2 say "The authors proceeded to train a VQC (= hybrid classical-quantum ) as a classifier within Qiskit (= classical computer simulator of qubits, Not a quantum computer ) machine learning principles."

↑ As a result, contrary to various hyped news, No quantum computers were useful for molecular energy calculation nor health care.

Quantum computer is completely useless for Alzheimer's diagnosis.

Insider Brief, the 4th-last~last paragraphs of this hyped news on the alleged quantum computing for Alzheimer's diagnosis say

"A hybrid classical-quantum (= which is substantially a classical computer, and the useless quantum computer could do almost nothing ) approach might (= just speculation, still useless ) boost the accuracy and efficiency of Alzheimer’s disease diagnosis."

"The experiments were conducted using a Hewlett Packard Core i5, sixth-generation computer with 8 GB RAM, and a Google Colab Pro GPU (= this research used ordinary classical computers for almost all calculations )."

"..On the quantum side, the researchers relied on a 5-qubit (= just 5 qubits or 5 bitstring 01101 is useless, unable to calculate anything, Not a quantum computer at all ) quantum hardware or simulator (= using classical computer for simulating imaginary quantum computer ), employing the QSVM model from the Qiskit library"

"However, the researchers acknowledge the need for further studies to evaluate the practical implementation of this model within medical devices (= still impractical )"

Only 5 qubits is useless, Not a quantum computer.

This research paper ↓

p.3-Methods say "In this paper, we introduced a method based on ensemble learning and quantum machine learning classification algorithms (= quantum algorithms do Not mean the use of quantum computers ) that analyze MRI brain images and extract meaningful features for successful classification of AD stages"

p.8-Conclusion and recommendation say "We used 5 qubit quantum hardware (= just 5 qubits is useless, Not a computer ) or simulator (= classical computer simulator of imaginary quantum computer was also used ) and we utilized the QSVM model from the Qiskit library "

↑ This research used only a 5-qubit quantum computer (= each qubit takes 0 or 1, so 5 qubits can express only 2⁵ = 32 numbers, which can be easily calculated much faster by the ordinary classical computer, so a quantum computer is completely unnecessary for this alleged Alzheimer diagnosis ).
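The point about 5 qubits can be made concrete: the entire 32-state space is enumerated instantly by a classical machine, so nothing about this search needs quantum hardware.

```python
# A 5-qubit register spans only 2^5 = 32 basis states -- a space so small
# that a classical loop exhausts it in microseconds.
n = 5
states = [format(k, f"0{n}b") for k in range(2**n)]
print(len(states), states[0], states[-1])   # 32 "00000" "11111"
```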

Actually this research also conducted the classical computer's simulation of the imaginary 5-qubit quantum computer by Qiskit algorithm of QSVM (= quantum support vector machine,  this p.7,  this p.27-simulators ).

This classical computer simulator of the imaginary quantum computer is usually far better than the actual (useless) error-prone quantum computer ( this p.6-right-Table 4 shows the IBM QASM classical computer simulator was much faster than the IBMQ_16_Melbourne quantum computer ).

As a result, the quantum computer with only a small number of qubits (= only 5 ) is completely useless for Alzheimer diagnosis, contrary to the overhyped news.

There is No such thing as a quantum simulator.

Physicists just oscillated small numbers of atoms or ions disorderly with a lot of errors, and called them (fake) quantum simulators that are useless.

It is impossible for today's error-prone quantum computers with only small numbers of qubits to calculate or simulate some meaningful physical phenomena.

So all the dubious news about the alleged quantum simulators is just fake, untrue and useless.

 

All types of qubits are hopeless, impractical forever.

(Fig.Q)  All types of quantum computers are extremely error-prone, unstable and impossible to scale up.

Photon quantum computer is most error-prone, impractical forever

See this page.

Other qubits

Most major companies such as Google and IBM focus on another type of quantum computer using superconducting qubits (= just classical circuits ), which also cause impractically-high error rates ( this 5th-paragraph,  this p.1-introduction-2nd-paragraph ).  And the necessity to always cool their bulky qubits (= each qubit is as big as a millimeter !  this 3rd-last-paragraph ) to almost absolute zero makes it impossible to realize the (imaginary) practical quantum computers that will need millions of qubits ( this 4th-paragraph,  this 4th-paragraph ) forever.

D-Wave annealing machines for optimizing problems are Not true quantum computers nor faster than classical computers.

Topological quantum computer trying to use an illusory quasiparticle with fractional-charge called anyon or Majorana is unreal, fraudulent and impractical forever, Not achieving even a single qubit yet.

↑ The latest topological quantum computer is fake and unable to realize the true fault-tolerant quantum computer based on imaginary braids (= unseen non-existent braids ) at all ( this 9~10th paragraphs ).

Fluxonium qubits are useless.

MIT's new fluxonium qubits (= only 2 qubits ! ) are still error-prone, far from an (illusory) practical quantum computer that will need at least millions of qubits.

The current leading quantum computer companies such as IBM and Google use superconducting quantum bits or qubits (= which consist of just classical circuits ) called transmons ( this p.6-left-1st-paragraph,  this 14th-paragraph ), whose error rate is the lowest (= still high, though ) among all types of qubits.

↑ But even this "best" transmon qubit suffers extremely high error rates: its two-qubit gate error rate is 0.5 ~ 1% or 0.005 ~ 0.01 (= fidelity is 99.5 ~ 99 % ), an error rate much higher than that of the current ordinary practical classical computers, whose error rate is less than 10⁻¹⁷ ( this 2nd-paragraph,  this 17th-paragraph,  this last-paragraph ).

The recent (hyped) news claimed that MIT researchers created (only) two qubits of a new type of superconducting qubit called "fluxonium ( this Fig.1 )", whose two-qubit-gate error rate is 0.001 (= 99.9% accuracy or fidelity,  this 6th-paragraph ).  This is still far worse than a classical computer and the (illusory) practical quantum computer, whose error rate must be less than 10⁻¹⁵ ( this 5th-paragraph,  this 3rd-paragraph,  this 7th-paragraph ), and the practical quantum computer will need at least millions of qubits ( this 19-21st paragraphs ), which is impossible to realize forever.

↑ In fact, this fluxonium qubit studied by MIT is Not new. However, No quantum computer companies have wanted to replace their conventional transmon qubits with this fluxonium qubit ( this 2nd-last~3rd-last paragraphs,  this p.1-right-2~3rd paragraphs ).

Because this allegedly-new type of fluxonium qubit is much more complicated ( this 5th-paragraph ) and much slower than the conventional transmon qubit, scaling up or increasing the number of these fluxonium qubits is impractical and impossible ( this 2nd-last-paragraph,  this-last~10th-last paragraphs,  this p.2-left-2nd-paragraph ).

↑ The time needed to operate a two-qubit (= CZ ) gate in fluxonium qubits is about 2~3 times longer (= which means 2~3 times slower than the conventional transmon qubit; fluxonium two-qubit gate time is 85 ~ 200 ns,  this last paragraph,  this p.7-left-2nd-paragraph, Fig.4,  this p.1-right-lower ) than in the conventional transmon qubits, whose gate time is 20~35 ns ( this p.2-left,  this Fig.1b,  this p.2 ).  ← No advantage, and the hopeless situation of quantum computers is unchanged.

Ion qubits are hopeless.

Ion-qubit quantum computer of Quantinuum is extremely slower and unable to be scaled up to the practical computer.  ← useless forever.

Trapped-ion quantum computers using unstably-floating ions (= two energy levels ) as qubits are known to be very slow, impractical, still error-prone and impossible to scale up (= still fewer than 50 ion qubits,  this 11th-paragraph ).

The recent hyped news claimed that Quantinuum's H1 ( consisting of unstably-floating trapped ion qubits ) might have successfully executed a fault-tolerant algorithm (= untrue ).

↑ Even in this latest Quantinuum ion-qubit computer consisting of less than 20 (ion) qubits (= falling far short of the million-qubit practical computer ), its error rate is extremely high (= each 2-qubit gate error rate is 2 × 10^-3 or 0.13 %,  this p.3 ), and the error rate even after the so-called "(deceptive) fault-tolerant" operation is still much higher (> 1.0 × 10^-3,  this 6th-paragraph ) than the error rates required for the practical computer (= practical computer's error rate must be less than 10^-15,  this abstract,  this p.1-left-lower ).  ← still far from practical use.

These ion-qubit quantum computers adopted by Quantinuum and Honeywell can never be put to practical use like all other hopeless quantum computers.  ← Physicists have already known this inconvenient fact, but had No choice but to hide this disastrous truth under the current unrealistic dead-end mainstream physics.

First, it is impossible to scale up or increase the number of ion qubits to the practically-required million qubits, because precisely manipulating many unstably-floating ions (whose two energy levels are used as the qubit 0 and 1 states ) by many lasers is unrealistic ( this 12-13th-paragraphs,  this p.1-intro-1st-paragraph, this 2nd-paragraph,  this trapped-ion qubit section ).
Even the latest quantum computer consists of fewer than 50 (ion) qubits ( this 4th-paragraph ).

Second, this ion-qubit quantum computer is more than 1000 times slower (= execution time is far longer ) than the current dominant superconducting qubits used by Google and IBM ( this p.4-right-3rd-paragraph ).

The gate time required for performing each two-qubit gate operation in ion qubits is longer than microseconds or μs (> 1000 ns,  this p.6,  this 2.2,  this p.2-left-1st-paragraph,  this p.5-right-lower ), which is far slower and longer than the gate time of superconducting qubits (= less than 50ns,  this p.2-left,  this Fig.1,  this p.11-Table.4 ).
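The per-gate speed gap is easy to verify from the numbers quoted above; a minimal sketch (the larger >1000x overall slowdown cited earlier also includes overheads beyond the per-gate ratio):

```python
# Per-gate speed gap quoted above: ion two-qubit gates take > 1000 ns,
# superconducting two-qubit gates take < 50 ns.
ion_gate_ns = 1000
superconducting_gate_ns = 50

per_gate_slowdown = ion_gate_ns / superconducting_gate_ns
print(f"ion qubits are at least {per_gate_slowdown:.0f}x slower per two-qubit gate")
# Note: the >1000x overall slowdown cited earlier additionally includes
# measurement, cooling and ion-shuttling overhead on top of this ratio.
```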

Furthermore, this latest Quantinuum's H1 ion-qubit quantum computer could Not correct errors; instead, they just artificially "post-selected" results to seemingly remove errors ( this p.7-left ), an ad-hoc method that is inapplicable to the practical larger computer.

Even in the latest (exaggerated) research in 2023/11/29, they used only 51 ion qubits (= this 5th-paragraph ) with extremely high error rates = the error rate of preparing the initial state is as high as 25% (= fidelity is 0.75 ) and each two-qubit error rate is 2% (= fidelity is 0.98,  this p.8-appendix C,  this-middle two-qubit gate error ), which is far from the practical quantum computer requiring millions of qubits and a less-than-10^-15 error rate.
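To see why a 2% two-qubit error rate is so far from practical, a small sketch (assuming errors compound independently and multiplicatively, a common simplification rather than the paper's exact noise model):

```python
# Overall success probability of a circuit built from gates with 2% error
# (fidelity 0.98), assuming independent multiplicative errors.
two_qubit_fidelity = 0.98

for n_gates in (10, 100, 1000):
    success = two_qubit_fidelity ** n_gates
    print(f"{n_gates:>4} gates -> success probability ~{success:.3g}")
# 1000 gates give a success probability of ~1.7e-9: effectively zero,
# while the text's practical target is a per-gate error rate near 1e-15.
```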

Also in the latest research in 2023/12, they could achieve only (impractical) 32 ion qubits with a lot of unsolved issues ( this 1st, 9-10th-paragraphs ).

↑ This latest 32-ion-qubit quantum computer's error rate is too bad (= 40% or fidelity is 0.6 ) in some random-number generating operations compared to the error-free practical classical computer ( this p.9-Fig.8 ).

As a result, the current hopeless situation of impossibility of practical quantum computer is unchanged (forever).

Quantum computer is deadend, still having only one impractical ion qubit (= Not a computer at all ).

The 1st, 4th, 10th, last paragraph of this hyped news say

"Researchers at ETH have managed to trap ions using static electric and magnetic fields and to perform quantum operations on them. In the future (= just speculation, still useless ), such traps could be used to realize quantum computers with far more quantum bits"

"In this way, it has been possible in recent years to build quantum computers with ion traps containing around 30 qubits (= far less than millions of qubits required for useful quantum computer ). Much larger quantum computers, however, cannot straightforwardly be realized with this technique (= deadend ). The oscillating fields make it difficult to combine several such traps on a single chip,"

" A single trapped ion (= only one ion qubit, which is Not a computer at all ), which can stay in the trap for several days, could now be moved arbitrarily on the chip, connecting points "as the crow flies" by controlling the different electrodes"

"As a next step, Home wants to trap two ions (= meaning this research just trapped only one ion qubit = Not a computer at all ) in neighboring Penning traps on the same chip.. This would (= just speculation ) be the definitive proof that quantum computers can be realized using ions in Penning traps (← false. Several ion qubits alone cannot prove quantum computers )"

Trapping only one ion (= only one bit ) cannot make a quantum computer that will need millions of qubits, and their research is already deadend with No progress.

↑ This research just trapped only one Be+ ion (= whose two energy levels were used as a bit or qubit's states 0 or 1 ), and moved it a little by controlling external electromagnetic field in Penning traps, and No quantum computer was realized, contrary to the overhyped headline "larger quantum computers".

Ordinary ion quantum computer (= still Not a computer ) can Not have or confine more than 32 ion qubits to one chip or one trap area, hence, they have to inconveniently move each ion to another Penning trap (= impractical, taking much time ) to realize interactions between more ions ( this-13-14th-paragraphs,  this p.1-left-1st-paragraph,  this p.1-left ).

↑ Even this latest research could trap only one ion (= just only one quantum bit or qubit 0 or 1 ) in a very large bulky trap chip (= one qubit size is as large as 100μm ~ 1mm, far bigger and bulkier than the ordinary classical computer's compact bit or transistor of only 40nm size ), and move it slightly (= about 50μm ) slowly (= each motion took 4 milliseconds ) at cryogenic temperature (= 6.5K = impractical ) with No quantum computer's calculation.

Practical quantum computer is said to need more than millions of qubits ( this-1st-paragraph,  this-2nd-paragraph,  this-3rd-paragraph ), so the current ion-trap quantum computer, which can Not have more than 30 qubits, is completely hopeless like all other quantum computers.

This research paper ↓

p.1-right-last-paragraph says "a single beryllium Be ion confined in electric trap (= only one impractical ion qubit )"
p.2-Fig.1 shows one ion qubit is very big and bulky (= more than 100μm )
p.4-left-2nd-paragraph says "an ion was transported (= about 50μm ) in 4ms"
p.6-left-lower says "temperature of 6.5 K."

↑ Despite long-time research across the world, practical quantum computers that will need millions of qubits are still a long way off (= which means unrealized forever ), and quantum computer research is regressing to only one (impractical) qubit instead of progressing to larger numbers of qubits with fewer errors.

 

Spin qubits are deadend.

Spin-qubit silicon-type quantum computer is extremely impractical, unstable, impossible to scale up.

Silicon-type quantum computers allegedly using a tiny electron spin dot as a qubit are far more impractical than other-type qubits.  ← They only achieved highly-impractical 12 qubits, which fall far short of the practical quantum computer requiring millions of qubits ( this-lower Disagreement with Intel,  this 10th-paragraph ).

↑ This silicon-type quantum computer (= still Not a computer at all ) just measures the atomic energy state or magnetic field ( this Fig.4 ) interacting with light ( this Fig.4 ); they can Not directly measure the (fictional) electron's spin itself.

Even the latest published spin-qubit quantum computer's research used only six electron qubits (= correctly, only two qubits consisting of six quantum dots,  this 4th-last paragraph,  this Fig.2 ), which is far from the practical quantum computer needing more than a million qubits ( this 2nd-last-paragraph,  this 3rd-paragraph ).

↑ And their error rate of the two-qubit CNOT gate is impractically high = 3.7 % (= 3.7 × 10^-2 = 100 % - fidelity 96.3 %,  this-abstract, p.4-Fig.4,  this 3~5th-paragraphs ), which is also far from the practically-required error rate of 10^-15 ( this 3rd-paragraph,  this last-paragraph ).
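A quick way to quantify the gap is to ask how deep a circuit can be before it is more likely wrong than right, assuming independent multiplicative errors (a simplification, not the paper's model):

```python
import math

# Maximum circuit depth before overall fidelity drops below 50%,
# assuming independent multiplicative per-gate errors.
def gates_until_half(fidelity: float) -> float:
    return math.log(0.5) / math.log(fidelity)

# At the 3.7% CNOT error rate quoted above (fidelity 0.963): ~18 gates.
print(f"at fidelity 0.963: ~{gates_until_half(0.963):.0f} gates")
# At the practically-required 1e-15 error rate: ~7e14 gates.
print(f"at the required 1e-15 error rate: ~{gates_until_half(1 - 1e-15):.2g} gates")
```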

The latest single-electron charge qubit (= each electron's motion pattern is used as a quantum bit's state 0 or 1 ) also has an impractically high error rate (= single qubit's read-out error is about 2% = 98% fidelity, this p.2-left ), and they did Not state the two-qubit error rates because the stable two-qubit gate operation is impossible.

Neutral atomic qubits are hopeless.

Cold atoms trapped in laser light or optical lattice are extremely unstable, easily lost and hard to manipulate, completely impractical.

Some groups such as Harvard try to exploit neutral atoms unstably trapped in laser light or optical lattice as quantum bits or qubits.

Each bit or qubit's 0 or 1 states are expressed by the neutral atomic two energy states or hyperfine states (= 0 or 1 depending on the atomic nuclear magnetic directions up or down with respect to electron's orbit ).

The problem of this atomic qubit is that physicists cannot control each atom precisely.

When they try to load atoms onto the designated optical lattice trap places, their success trapping rate is only 50% (= 50% of all the places remain empty,  this p.1-right,  this p.1-left-2nd-paragraph ).

And they cannot precisely control the places where atoms are loaded.  Bad survival rate ( this p.2-Fig.3B, p.3-Fig.4D ).

Fragile neutral atoms have to be kept at almost absolute zero temperature, which is impractical ( this-last-paragraph,  this p.6-left-1st-paragraph-10μK,  this p.7-1-2nd-paragraphs-1mK ).

Even at such an extremely low temperature, atoms frequently drop out of the laser light optical lattice and disappear within 10 seconds ( this-middle-Components in detail,  this p.8- 4-5th-paragraphs ), which cannot be used as stable memory bits.
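A simple exponential-loss sketch (an assumed model, not the papers' exact data) shows how quickly a register of atomic "memory" qubits empties out with the ~10 s trap lifetime cited above:

```python
import math

# Fraction of atoms still trapped after t seconds, assuming exponential
# loss with a ~10 s lifetime (the figure cited in the text above).
lifetime_s = 10.0
for t in (1, 10, 30):
    surviving = math.exp(-t / lifetime_s)
    print(f"after {t:>2} s: ~{surviving:.0%} of atoms still trapped")
```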

When reading each atomic bit state, physicists illuminate atoms with laser light with some wavelengths that can distinguish two hyperfine energy states, so that only one of two states is excited to the higher level, and emits the detectable fluorescence light.

↑ Even in this process of reading each atomic state, atoms hit by the probe light are often removed from the optical lattice ( this p.3-C. Atom Loss,  this p.9-right-2nd-paragraph ).

↑ There are a few cases of non-destructive detection methods, but they are limited to only less than 10 atoms or 10 qubits with high error rates ( this p.1-left, p.9-left ).

It takes an impractically long time (= milliseconds ~ seconds ) to reload new atoms into the optical trap, and each operating time of an atom is longer and slower than that of other qubits ( this 16th-paragraph,  this p.3-right-last-paragraph ).

In two-qubit operation, physicists have to excite each atom to a very high energy state with a big atomic radius called the Rydberg state, which is extremely short-lived ( this p.13-3~4th-paragraphs ) and impractical with a high error rate or low fidelity ( this-6th-paragraph ).

Error correction of atomic qubits is also impossible.

 

Quantum computer's overhyped news

Atomic clock has nothing to do with ( error-prone ) quantum computers.

The 2nd, 7th, 8th, 9th paragraphs of this hyped news say

"Researchers recently explored the possibility (= just uncertain possibilities ) of using quantum computing techniques to further improve the performance of atomic clocks. Their paper, published in Nature Physics, introduces a new scheme that enables the simultaneous use of multiple atomic clocks to keep time with even greater precision (= lie, this research's error rate is far worse than ordinary atomic clock, and this research has nothing do to with quantum computers )."

"The researchers used their proposed technique to control individual atoms in atomic clocks (= unlike the practical atomic clocks treating many mixed atoms simultaneously, controlling individual atoms in this research caused impractically-worse errors ). Specifically, they ensured that each atom effectively experienced the passing of time slower or faster, depending on their dynamic position in relation to the applied laser beam."

"The current most precise clocks in the world work by measuring the passage of time with a large ensemble of atoms, but we demonstrate that individual control could lead to better performance (← untrue, controlling an individual atom or an unstable atomic qubit shows a far larger error rate ) More generally, our work shows the power of combining the features of quantum computers and quantum sensors (← untrue, this research has nothing to do with quantum computer nor sensors )"

"The initial findings.. are very encouraging, highlighting the potential (= just uncertain potential, still useless ) of quantum computing techniques in metrology research. In the future (= just speculation ), this study could inspire the development of other programmable quantum optical clocks"

↑ This research just used only two independent Sr atoms (= Not a quantum computer that will need a lot of atoms or qubits ) interacting with laser light, and measured the atomic energy levels' change with error rate far worse than the ordinary atomic clock, so useless research.

In the ordinary atomic clock, the wavelength or frequency (= energy ) of the microwave is adjusted to agree with the atomic two energy levels' interval by illuminating a collection of atoms with various microwaves.

↑ The error rate of manipulating an atom in this research is far higher and worse (= fidelity (= 1 - error rate ) is 0.984, i.e. the error rate is 0.016,  this p.3-left-last = average fidelity is 0.984 in this research ) than the ordinary atomic clock's error rate of only 10^-17 ~ 10^-18.

↑ So this research is useless not only for (error-prone) quantum computer but also for atomic clock.

Quantum computers must manipulate each atom, which causes extremely large error rate inapplicable to precise atomic clocks (= so this news claiming such error-prone quantum computers might make time more precise is a total lie ).

To use each single atom or ion for an atomic clock, they usually have to take an extremely long time = about 10 days ~ months ( this future plan,  this p.3-abstract ) at extremely low temperature to estimate the precise average atomic energy frequency or time, which is completely impractical.

Actually, this research paper's p.1-abstract-last just says
"Our results demonstrate the potential (= just potential, still unrealized ) of fully programmable quantum optical clocks (= No mention of quantum computer nor practical atomic clock ) even without entanglement (= No entanglement means this research did No two-qubit operation required for computer's calculation ) and could be combined with metrologically useful entangled states in the future (= just speculate uncertain future )."

The world's largest 1000-atomic-qubit quantum computer is meaningless and useless.

Quantum computers are still error-prone and far from practical computer that will need millions of qubits.

The current world's largest quantum computer is said to be Atom Computing company's 1180 atomic qubits or IBM's 1121 superconducting qubits, neither of which disclosed the detailed performance of these dubious machines ( this 1-3rd,12th-paragraphs  = D-Wave is Not a real quantum computer, so excluded ) due to their still-impractical error-prone quantum computers.

This 7~10th paragraphs say
"For quantum computers to dethrone today's best classical computers, however, scientists would still need a quantum processor with millions of qubits (= 1000 qubits can do nothing )."

"And that's a problem, because qubits are notoriously error-prone and need to be kept at near absolute zero (= energy-inefficient, impractical ).. One in 1,000,000,000,000,000,000 (billion billion) bits in conventional computers fails, while the failure rate in quantum computers is closer to 1 in 1,000 (= quantum computer is far more error-prone than ordinary classical computer )."

↑ So the current largest quantum computer (= still Not a computer nor calculator at all ) is far from the practical quantum computer that will need millions of qubits and far less error rates.

Quantum computers are said to calculate faster by exploiting ( still-unproven, fictional ) parallel-universe (= or superposition ) simultaneous computation.

But the recent research showed ordinary classical computer could calculate much faster and more accurately than the error-prone (impractical) quantum computer ( this 6th-paragraph ) even if classical computer was unfairly forced to emulate (still-unproven) quantum mechanical superposition or parallel worlds.

The latest largest 1305-atomic qubit quantum computer is still error-prone and useless.

The latest news claims that researchers at TU Darmstadt have created the world's largest 1305-atomic-qubit quantum computer, whose properties were published in a peer-reviewed journal for the first time (= Actually, their largest quantum computer is still Not a computer, because they could calculate nothing by using it in this research ).

The 2nd, 4th, 7th paragraphs of the hyped news about this alleged world's largest quantum computer say

"Quantum processors based on two-dimensional arrays of optical tweezers, which are created using focused laser beams, are one of the most promising technologies for developing quantum computing and simulation that will (= uncertain future ) enable highly beneficial applications in the future (= just speculation, still useless quantum computer )."

"the team reports on the world's first successful experiment to realize a quantum-processing architecture that contains more than 1,000 atomic qubits (= still far from millions of qubits required for practical computer ) in one single plane."

"A total of 1,305 single-atom qubits were loaded in a quantum array with 3,000 trap sites and reassembled into defect-free target structures with up to 441 qubits."  ← So actually, they created only 441 atomic qubits (= Not 1000 qubits ) that were placed in the "designated" positions (= even arranging atoms or qubits in desired positions is still impossible, so useless ).

↑ In the atomic-qubit quantum computer, they try to trap multiple cold neutral atoms (= each atomic two energy levels are used as a bit's 0 or 1 states ) in laser light or optical lattice tweezer.

↑ The most serious problem which makes atomic-qubit quantum computer impractical is that those atoms easily disappear and drop out of (unstable) laser optical lattice trap ( this technical challenge ).

This research paper (= this p.3-left-results, p.4-left-1st-paragraph ) says
"an average loading efficiency of approximately 40% (= only 40% of atoms are trapped or loaded in laser light as qubits, and 60% atoms are lost )"

"With supercharging, the cumulative success probability increases to a value of 35% (= success rate of putting atoms onto right places is only 35%, and 65% error rate !  this-upper-figure-(a)-red line ), still not exhibiting saturation at the maximum implemented number of 50 assembly cycles"

↑ This research on the alleged world's largest atomic-qubit quantum computer (= still impractical, and error-prone ) did Not perform any calculations nor qubit operations.

According to the recent unpublished Atom Computing's paper, they had to continuously load new atoms onto optical lattice at time intervals of 300ms (= the frequently-lost atomic qubits cause errors and cannot be used as practical memories ), because atoms or qubits constantly disappear with lifetime of less than only 30 seconds ( this p.6-right-Appendix B,  this p.9-2nd-paragraph ).

↑ In spite of this frequent loading of new atoms, still 1% atoms remained lost in only 1225 atomic qubits (= far from the practical millions of qubits, this 1st, 4th-paragraphs,  ← Atom Computing's paper does not show success probability or error rates of loading atoms ).
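The figures above are mutually consistent: with a ~30 s atom lifetime and a 300 ms reload interval, an exponential-loss model (an assumption, not the paper's exact model) predicts roughly 1% of atoms missing per interval, matching the ~1% reported:

```python
import math

# Expected fraction of atoms lost during one 300 ms reload interval,
# assuming exponential loss with a ~30 s lifetime (figures from the text).
lifetime_s = 30.0
reload_interval_s = 0.3

loss_per_interval = 1 - math.exp(-reload_interval_s / lifetime_s)
print(f"expected loss per reload interval: {loss_per_interval:.2%}")
```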

↑ Atom Computing's quantum computer also did Not calculate anything nor operate qubits due to focusing all of their time and energy only on atomic qubit loss ( this-abstract ).

As a result, quantum computers are already dead and hopeless with No progress, contrary to an incredible amount of overhyped news.

Hyped topological quantum computer with fictional anyon quasiparticle  in 2024.

Research on a topological quantum computer with fictional non-Abelian anyon quasiparticles, which was allegedly less error-prone (= wrong, actually very error-prone ) and appeared in an arxiv preprint a year ago, was published in Nature recently (= 2/14/2024 ).

First of all, this topological quantum computer is just fiction and scientifically meaningless except for publishing papers in academic journals.

In a topological quantum computer, fictional quasiparticles such as anyons (= Not real particles, this 3rd-paragraph ) and Majorana with fractional charge are said to entangle with each other by using fictional unseen braids (= Not physical strings, this p.3-middle-2nd-paragraph ), and these nonphysical tied braids of anyon quasiparticles are said to make the topological quantum computer robust or less error-prone (= only inside unproven imaginary mathematical theory ).

There is No evidence that such (fictional) anyon quasiparticles or magical braids (= Not real particle or braid, but just nonphysical math thing, this 10th-paragraph ) may exist.

This recent alleged non-Abelian topological quantum computer published in Nature just used 27 trapped ion qubits (= still far from millions of qubits required for practical quantum computer ), which are Not the authentic (fictional) anyon quasiparticles, and they are still error-prone (= due to No real magical braids ) and useless.

This 9th-paragraph says
"However, some researchers claim that Quantinuum has Not actually created non-Abelian anyons. They argue that the firm has instead merely simulated them ( this 2nd-last-paragraph )."  ← They tried to pretend that 27 ions ( this p.2-left-2nd-paragraph, p.4 ) were fictional anyon quasiparticles.

This last-paragraph says
"All in all, there are still questions on whether the "credible path to fault tolerant quantum computing" was unlocked ( this 10-13th-paragraphs )"  ← still No evidence of less errors

This 27-ion-qubit quantum computer (= still Not a computer ) pretending to be (fictional) braided anyon quasiparticles ( this p.27-30 ) still has a very bad and high error rate of 35% (= 1.6% error rate per single qubit ) or fidelity (= 1 - error rate ) of 0.65 (= 0.984 per qubit, this p.12-A14,A15 ), which is far worse and higher than the 10^-15 error rate needed for a useful quantum computer.
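The per-qubit and overall figures quoted above are consistent: a per-qubit fidelity of 0.984 compounds over 27 ion qubits to roughly 0.65, assuming independent per-qubit errors:

```python
# Per-qubit fidelity 0.984 compounded over 27 ion qubits (assuming
# independent errors) gives the ~0.65 overall fidelity quoted above.
per_qubit_fidelity = 0.984
n_qubits = 27

overall_fidelity = per_qubit_fidelity ** n_qubits
print(f"overall fidelity: ~{overall_fidelity:.2f}")        # ~0.65
print(f"overall error rate: ~{1 - overall_fidelity:.0%}")  # ~35%
```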

Quantum computer is deadend, regressing to only one useless qubit, which is far from the practical millions of qubits.

The 3rd, 4th, 13th, 14th paragraphs of this hyped news say

"In a paper published in Nature Communications, the engineers describe how they used the 16 quantum 'states' of an antimony (= Sb ) atom to encode quantum information (= actually only two energy levels out of 16 splitting energy states of one Sb atom could be used as only one qubit's 0 and 1 in this research, this p.7-Gate fidelities,  which is still Not a computer )."

"Antimony is a heavy atom that can be implanted in a silicon chip, replacing one of the existing silicon atoms. It was chosen because its nucleus possesses eight distinct quantum states, plus an electron with two quantum states, resulting in a total of 8 x 2 = 16 quantum states, all within just one atom (= only one qubit ). Reaching the same number of states using simple quantum bits—or qubits (= this experiment used only one Sb atom or one qubit, which is Not a quantum computer that will need millions of qubits )"

"The quantum computers of the future will (= just speculation, still useless ) have millions, if not billions of qubits (= but this experiment used just one impractical atomic qubit ) working simultaneously to crunch numbers and simulate models in minutes that would (= just baseless speculation ) take today's supercomputers hundreds or even thousands of years to complete (= which overhyped description contradicts the latest result proving classical computer outperforming quantum computer )"

"While some teams around the world have made progress with large numbers of qubits, such as Google's 70 qubit model or IBM's version which has more than 1000, they require much larger spaces for their qubits to work without interfering with one another"

↑ The current leading superconducting qubit is inconveniently big (= one bit is millimeter ), bulky and energy-inefficient (= they have to be cooled to almost absolute zero ), which cannot be scaled up to millions of qubits required for the practical quantum computer ( this 3~4th-paragraphs ).

↑ This research tried to use "only one single Sb atom" as one qubit (= using two energy levels of a Sb atom as a bit's states 0 and 1,  this p.7-left-gate-fidelities-only one-qubit gate ) with still high-error rate.  ← Using only one bit or qubit (= even two qubit operation has Not been done in this research ) is Not a computer at all.

Still only-one qubit, high error rate, Not a computer at all.

The error rate of one-qubit operation in this research is still impractically high = one qubit's error rate is 1 ~ 10% ( this p.17-Table S2,S3 ), which is far higher and worse than the practically-required error rate of less than 10^-15.

To explain the observed multiple energy levels of a Sb atom under electromagnetic field, they just fitted free parameters to experiments ( this p.11-S11, p.12-S15 ) with No quantum mechanical prediction nor calculation.

The practical quantum computer is said to need more than millions of qubits ( this 7~8th-paragraphs ).
But even this latest research in 2024 could make only one unstable (atomic) qubit with impractically-high error rate.

So quantum computer research has been already deadend with No progress, rather, they are regressing to only one single useless qubit (= Sadly, publishing papers in journals instead of aiming to invent useful machines is the current research's only purpose ).

 

Quantum network, internet is hoax

(Fig.C)  Quantum cryptography uses very weak classical light as fictitious photon's information which is easily lost and impractical forever.

See Quantum network is hoax.

Quantum information, internet hype

Quantum key distribution is impractical

The 3rd paragraph of this hyped news says
"She investigated how the visibility of the so-called Hong-Ou-Mandel effect, a quantum interference effect (= wrong. This can be explained by classical light interference + a photodetector that just detects electrons ejected by classical light, Not fictitious photon particles ), is affected by multiphoton contamination."

And the 2nd-last and last paragraphs of the same news say
"The findings could (= just speculation ) be important for quantum key distribution, which is necessary for secure communications in the future (= uncertain future )
However, many questions remain unanswered, Little research has been done into multiphoton effects, so a lot of work is still needed (= still useless quantum key distribution )"

Quantum memory is impractical forever.

Physicists try to develop "quantum memory (= usually some atomic energy levels are used for storing light energy or information )" that can allegedly store information of light or photon transiently.

But even in the latest research in 2023, the storing efficiency of this quantum memory is only 60% (= 40% light or photon information is lost, each time they try to store information in this impractical quantum memory ), which is deadend, unable to make useful quantum internet.

But the 2nd and last paragraphs of this hyped news say
"So far, developing these high-dimensional memories has proved challenging, and most attempts have not yielded satisfactory efficiencies"

"In the future, we will (= uncertain future, still useless ) establish high-dimensional quantum repeaters using high-dimensional quantum memories, enabling high-dimensional quantum communication between two or more remote quantum nodes (← ? )"

Quantum internet, network is joke, impractical forever, contrary to the media-hype.

The 10th, 13th, 14th paragraphs of this recent hyped news say

"A spinning electron (= wrong, an electron's spin is Not real spinning ), or electronic qubit, is very good at interacting with the environment (= wrong, there is still No practical qubit ), while the spinning nucleus of an atom, or nuclear qubit, is not. We've combined a qubit that is well known for interacting easily with light with a qubit that is well known for being very isolated, and thus preserving information for a long time (= this is wrong, too )"

"In general, then, light carries information through an optical fiber to the new device, which includes a stack of several tiny diamond waveguides that are each about 1,000 time smaller than a human hair. Several devices, then, could act as the nodes that control the flow of information in the quantum internet (= wrong, still far from internet )."

"The work described in Nature Photonics involves experiments with (only) One (tiny) device (= Not internet nor network connecting multiple devices at all ). Eventually, however, there could (= just speculation, still useless ) be hundreds or thousands of these on a microchip (= just imagination )"

↑ This research tried to use two atomic energy states or levels (= Not spin itself ) of a tin atom inside a tiny diamond as (quantum) bits 0 and 1 (= the fictitious electron spin itself is unseen; they just measured two energy states interacting with light, No quantum mechanical prediction nor calculation using spin was done in this research, just experimental energy values were used, this p.3-left ).

Realization of practical quantum internet or network is impossible forever due to the severe loss of light or photon whose polarization or phase is used as bit information.

Even this latest research (= p.1-left-last~right ) admits
"However, spectral instability and low-rate coherent-photon emission remains an obstacle for scaling up quantum networks (= still impractical )."

First of all, the tiny device allegedly used as qubits in this research must be cooled to extremely low temperature = only 0.4 K (= almost absolute zero ! ), which is impractical ( this p.7-left-2nd-paragraph ).

They reported about 98% of light or photons entering this tiny device were lost, which means the light or photon detection efficiency (after interacting with this tiny device) even in this latest research was only 1.7 % (= more than 98% error rate ), as shown in this p.5-6-Table S1.

Probability of detecting two photons (= two weak lights ) is miserably low = only 5.0 × 10^-5 = 0.00005 (= due to severe photon loss ) as shown in this p.4-left-2nd-paragraph and this Fig.3c.

As a result, quantum internet, quantum network and quantum computer trying to use fragile photon or very weak light as information messenger in vain are impractical forever, because photon detection efficiency or severe photon loss have Not been improved at all despite extremely long-time researches across the world.

Quantum internet is still joke, impractical forever.

The 2nd, 4th, 7th paragraphs of this hyped news say

"A team.. have taken a significant step toward (= meaning "still unrealized" ) the building of a quantum internet testbed by demonstrating a foundational quantum network measurement that employs room-temperature quantum memories. "

"While the vision of a quantum internet system is growing and the field has seen a surge in interest from researchers and the public at large, accompanied by a steep increase in the capital invested, an actual quantum internet prototype has Not been built (= even the prototype of quantum internet has Not been built )"

"They tested how identical these memories (= this quantum memory just means Rb atomic two energy levels' transition storing the incident weak light or photon energy, this p.6 quantum memories, p.2-results ) are in their functionality by sending identical quantum states (= just using weak classical polarized light or photon as quantum information carrier ) into each of the memories and performing a process called Hong-Ou-Mandel (= HOM ) Interference (= which is just interference of two classical lights after passing beam splitters ) on the outputs from the memories, a standard test to quantify the indistinguishability of photon properties."

↑ This research just sent two very weak (classical) light called (fictitious) photons through rubidium (= Rb ) atomic vapor (= each Rb atomic energy levels can store the incident light's energy transiently = only microsecond ) called quantum memory, and got two retrieved lights (= significant photon loss, so useless quantum memory ) interfering with each other at beam splitters.

This p.3-right-4th-paragraph says
"a mean photon number per pulse of ~1.6 serve as the inputs to the memories in a single-rail configuration. After storage and retrieval from each memory, we measure the mean signal photon number to be ~0.01 (= photon number drastically decreases from 1.6 to only 0.01 after stored in the impractical quantum memory ) at the inputs of the beam splitter"

According to this p.3-Table 1, the input photons (= which are just weak classical light whose intensity exceeds the photodetector's threshold ) suffered significant loss while traveling and getting through the quantum memory, and the photon number drastically decreased from 1.6 to only 0.00065 (= photon detection efficiency is only 0.00065/1.6 = 0.0004, meaning 99.96% of photons were lost ), so this quantum memory or internet is completely impractical.
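The end-to-end loss is simple arithmetic on the two photon numbers quoted above:

```python
# Photon-loss arithmetic from the figures quoted above: mean photon number
# drops from 1.6 at the input to 0.00065 after the quantum memory.
input_mean_photons = 1.6
output_mean_photons = 0.00065

efficiency = output_mean_photons / input_mean_photons
print(f"detection efficiency: {efficiency:.2%}")   # ~0.04%
print(f"photon loss: {1 - efficiency:.2%}")        # ~99.96%
```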

This paper p.6-left-3rd-paragraph also says
"several hurdles still exist toward using warm atomic vapor QMs (= quantum memories ) in memory-assisted measurements and operations. As evidenced in this study, mitigating the noise associated with these memories stands as a paramount challenge to achieve indistinguishable retrieved qubits."

As a result, the quantum internet or network is already deadend and impractical (forever) despite an incredible amount of time and money wasted for this (fruitless) research across the world.
Only unfounded hypes misleading laypersons remain.

 


2023/11/21 updated. Feel free to link to this site.