Quantum computers are hopeless and useless forever; only hypes dominate the news.


Quantum computer research has already hit a dead end, hopeless with No progress and No supremacy, due to unrealistically high error rates and the bogus parallel-world-calculation assumption.

Quantum computer is already dead, useless forever. Only hypes remain.

(Fig.Q) Quantum computers that can Not give right answers due to their small numbers of qubits and large numbers of errors are far inferior to ordinary practical classical computers.

The useless dead-end mainstream physics or quantum mechanics needs fake scientific targets such as quantum computers, allegedly utilizing fantasy parallel worlds or superposition for (illusory) faster parallel calculations, and quantum information.

Contrary to the media hype, all the research on these dubious quantum computers, quantum information, and quantum cryptography has already hit a dead end with No progress, No hope ( this 1st-sentence ) and No advantage, which is why many companies and academia need to spread a lot of misleadingly-exaggerated news every day to make it appear to be progressing ( this 6th-paragraph ).

Ordinary practical classical computers have billions of bits (= one bit gives a 0 or 1 value ) or transistors with almost No errors (= the classical computer's error rate is extremely low, less than 10^-17, or only one error in 10^17 bit operations ).

The current quantum computers are impractical, with only fewer than 500 qubits (= corresponding to a classical computer's bits ) and extremely high error rates (= 10^-2 ~ 10^-3, or one error in every 100 ~ 1000 qubit operations ), which is far inferior to ordinary practical classical computers ( this p.1-intro-2nd-paragraph,  this 2nd-paragraph ).

The future practical quantum computer is said to require more than millions of qubits ( this 11th-paragraph,  this 19~21st-paragraphs,  this 5-6th-paragraphs ) and an error rate of less than 10^-15 (= or a fidelity = 1 - error rate of more than 0.999999999999999 is needed for a practical quantum computer,  this 5th-paragraph  this-abstract,  this-5th-paragraph,  this 6-7th-paragraphs,   this last-paragraph ), which is impossible to realize forever.

This is because research on quantum computers has already been at a dead end with No progress for a long time, and the dreamlike faster quantum computer's mechanism is based on impossible fictional parallel-world calculations.

In the (fake) quantum supremacy experiment, Google's final accumulated error or noise rate rose as high as 99.8% ( this p.3(or p.2)-1st-paragraph ), which cannot give right answers, and this is why the current impractical quantum computers are used only for producing meaningless random numbers ( this 9-11th-paragraphs ).

↑ Random numbers remain random numbers even if the quantum computer makes a lot of errors (= the present quantum computers can Not give right answers due to their extremely high error rates ), which means they try to hide the quantum computer's errors inside random meaningless numbers, and such error-prone random meaningless numbers can Not prove quantum supremacy.

If physicists try to correct errors, their error rates tend to get worse and increase instead of decreasing, which can Not be called "error correction" at all ( = Google quantum computer error rate increased to about 3.0% or 0.03 from the original 1.0% or 0.01 error rate per error correction operation,  this 3rd-last-paragraph, which means their error correction operation can Not correct errors at all ).

Hyped news abounds about impractical quantum computers

Overhyped news is necessary to hide the inconvenient fact that quantum computers are a dead end and useless forever.
Companies only want subsidies, researchers only aim to publish papers in top journals in vain for unrealistic hopeless computers.

It's impossible to realize practical quantum computers, which will need millions of qubits and error rates 10^15 times lower than those of the current error-prone impractical quantum computers.

In order to hide the inconvenient truth that quantum computer research has already hit a dead end with No progress, many companies and academia, colluding with the media and journals, continue to create and spread misleadingly overhyped harmful news that clearly hampers the progress of really useful science such as curing diseases.

For example, this news is full of misleading hypes like

"Scientists have come a step closer (= still unrealized ) to making multi-tasking 'quantum' computers"

"team has made a breakthrough, published in the journal Nature Communications, which may (= just baseless speculation ) have overcome that obstacle."

The 2nd paragraph of this news is also hyping like,
"a single quantum computer would (= just baseless speculation ) be more powerful than all the supercomputers, posing seemingly limitless potential (= just uncertain future )"

↑ The use of words such as "step closer", "may", "could", "would", "future", "promising" means their quantum computers are still unrealized and useless now.

Actually, contrary to a lot of overhyped news about (imaginary) quantum supremacy or advantage, quantum computers are already at a dead end, far from practical use ( this last-paragraph,  this 9th-paragraph ).

This 16th paragraph uses a vague, mild expression implying the hopeless future of quantum computers,
"achieving quantum advantage would not necessarily lead to commercial viability (= meaning 'useless' ) and quantum computers may (= uncertain speculation ) only outperform classical computers in a very narrow set of tasks with limited practical implications."

Misled by the current harmful overhyped news spread by the media, corporations and academic organizations that want taxpayers' money, students who become researchers will just end up wasting their time aiming only at publishing papers that go nowhere, as slaves to (top) journals, eventually ruining their careers and futures on this useless pseudo-science.

"Over-ambitious roadmap" for uncertain speculative future is also one of hypes used in many companies, ex. QuEra-Harvard team's atomic qubits has No ability to correct errors even in a small number of qubits, instead, they just discarded qubits that showed errors, and even after more than 99% of their qubits were discarded due to errors, their final error rate was miserably high = 90% (= fidelity is only 0.1 ), which is completely impractical and hopeless. = their so-called roadmap is nothing but baseless hype.

D-Wave annealing machines allegedly used for logistics or optimization problem are fake quantum computers with No quantum speed-up nor utility.

Quantum annealing machines for optimization problems, which try to find the lowest energy, can Not reach right answers, so they are useless.

(Fig.D)  D-Wave quantum annealing machines are slower and worse than classical computers with good algorithms in optimization problems, so there is No quantum computer advantage.  Only quantum hypes remain.

Quantum computers are still useless ( this middle ), so D-Wave machines allegedly used for optimization or traveling salesman problems ( this abstract ) such as logistics, scheduling, transportation are fake quantum computers called annealers that are also impractical with No advantage over ordinary classical computers ( this 7-8th-paragraphs,  this p.2-right-1st-paragraph,  this p.2-second-paragraph,  this 2nd-last-paragraph,   this p.17-second-paragraph,  this p.1-intro-1st-paragraph ).

↑ Their fake quantum speed-up came from unfairly choosing a very bad, time-consuming method as the (fake) classical comparison algorithm ( this 6th-last-paragraph,  this 2nd-3rd-paragraphs,  this 8th-last~3rd-last paragraphs,  this abstract ).

↑ These very bad, time-consuming classical optimization methods deliberately chosen to show (fake) quantum speed-up are called "(classical) simulated annealing" or "Monte-Carlo" in the D-Wave and Harvard cases ( this-abstract-lower ).
These bad classical methods try to find the lowest-energy value of the impractical optimization problem called the Ising model by almost random, disorderly searching for the lowest energy state, which takes too much time.
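As a rough illustration of what such a classical baseline actually does, below is a minimal Python sketch of simulated annealing on a tiny random Ising model (the problem size, couplings and cooling schedule are arbitrary illustrative choices, Not the actual D-Wave or Harvard benchmark instances):

import math
import random

def ising_energy(spins, J):
    # Energy of a small fully-connected Ising model: E = -sum_{i<j} J[i][j] * s_i * s_j
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j] for i in range(n) for j in range(i + 1, n))

def simulated_annealing(J, n, steps=20000, t_start=5.0, t_end=0.01):
    # Start from random spins and cool slowly, accepting energy-raising flips
    # with Boltzmann probability (Metropolis rule) -- an almost random, disorderly search.
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J)
    best_energy, best_spins = energy, spins[:]
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling schedule
        i = random.randrange(n)
        # Energy change from flipping spin i (only the terms containing spin i change).
        delta = 2 * spins[i] * sum(J[min(i, j)][max(i, j)] * spins[j] for j in range(n) if j != i)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            spins[i] = -spins[i]
            energy += delta
            if energy < best_energy:
                best_energy, best_spins = energy, spins[:]
    return best_energy, best_spins

random.seed(0)
n = 12                                                       # tiny illustrative problem
J = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
energy, spins = simulated_annealing(J, n)
print("lowest energy found:", round(energy, 3))

Even this crude, almost-random search is only the weak classical baseline that the speed-up claims were compared against; better classical methods such as Selby's algorithm (below) search far more efficiently.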

They deliberately chose Not to use faster and better classical optimization methods such as Selby's algorithm (= this good classical Selby method outperformed D-Wave quantum annealing ) in order to claim an (illusory) quantum computer speed-up ( this 6~7th-paragraphs,  this 8, 15, 19th-paragraphs,  this p.6-left-C ).

↑ Quantum annealing of D-Wave is known to often give wrong answers stuck in local energy minima (= Not the global lowest energy nor right answer ).

For example, in some tasks ( random instances with full range within granularity, or rfr ), D-Wave gave zero right answers in 20 instances after running for several minutes ( this p.11-7.2 ), while the good classical Selby's algorithm ( this 2nd-3rd paragraphs ) could give perfectly right answers in all 20 instances within 30 seconds (= this p.15,p.16-Table2 shows Selby-best-30, or 30 seconds, gave better performance = more right answers than D-Wave or DW-best, e.g. in rfr, 20 right answers for classical Selby vs. 0 right answers for D-Wave ).  ← The classical computer is far better than quantum annealing.

This is why they are often hyping their still-impractical quantum computer ( this 2nd-last paragraph ).

The recent Science paper claimed some quantum-computer-enhanced algorithm might outperform one of the (very bad, slower) classical methods called the "greedy algorithm", which is much worse and more inefficient than even the above slow, bad classical simulated annealing ( this or this p.16,  this p.2-left-last-paragraph ); hence, this paper did Not show quantum computer speed-up over the faster, good classical methods, either.

↑ Actually, this Science paper's dubious quantum-computer-enhanced (= QE) greedy optimization method (= this research actually used a deceptive hybrid computer combining a classical computer and a small, impractical, error-prone 72-qubit quantum computer ) was slower than even the above bad classical simulated annealing and classical tensor networks ( this p.4-right, p.5-Fig.3-classical-red is better than quantum-green in performance,  this p.14 ).

No quantum computer's speed-up in optimization problems (= logistics, scheduling, traveling salesman problem.. )

Just hypes remain.

D-Wave or quantum annealing recent hyped news

D-Wave finally gives up quantum supremacy and focuses on an impractical quantum model as a slave to academic journals ?

The 2nd and 3rd paragraphs of this hyped news say

"Using a D-Wave quantum annealing platform, the team found that fluctuations can lower the total energy of the interacting magnetic moments, an understanding that may (= just speculation, still useless ) help to reduce the cost of quantum processing in devices."

"In this research, rather than focusing on the pursuit of superior quantum computer performance over classical counterparts (= D-Wave finally stops spreading quantum speed-up hype ? ), we aimed at exploiting a dense network of interconnected qubits to observe and understand quantum behavior (= ambiguous quantum something )"

↑ This research tried to treat D-Wave's superconducting qubits (= just classical circuits ), which are just fake artificial atoms (= Not real atoms nor spins ), as a fictitious (anti)ferromagnetic system to explain some meaningless (thermal) fluctuation, with No quantum mechanical speed-up nor quantum utility.

This p.3 used nonphysical abstract Ising model with No real particle picture as fictitious antiferromagnetic model, where D-Wave's transverse (classical) magnetic field fluctuating each superconducting qubit's magnetic direction is treated as ambiguous quantum fluctuation ( this p.7 ).

This p.4-left-lower says
"With discrete variations of the (magnetic) coupling strength (= between D-Wave's artificial atoms, this p.5 ) the system’s effective (= fake ) temperature is mimicked"  ← Not real atoms.

↑ Using fictitious artificial atoms to simulate some meaningless small random fluctuation in imaginary antiferromagnet is useless except for publishing papers in academic journals.

Quantum computers can never give right answers due to their unrealistically high error rates, so they can never surpass classical computers.  ← No quantum supremacy

All the present quantum computers (= still Not computers or calculators at all ) always give wrong answers due to their extremely high error rates ( this p.1-introduction-2nd-paragraph ), and quantum error correction is impossible.

Ordinary practical classical computers can always give right answers with No errors, as shown by your very precise and reliable laptop computer, so there is No quantum computer supremacy or advantage, due to the quantum computer's unrealistically high error rates ( this p.1-right,  this p.1-right-2nd-paragraph ).

Quantum computer's supremacy is fake, useless.

Quantum computers' supremacy and advantage are an illusion, just outputting random meaningless numbers with No utility (= No evidence of quantum superposition due to their inability to remove errors ).

(Fig.S)  Quantum computer just generating random meaningless numbers cannot prove supremacy at all.

Quantum computers show only errors.  ← No quantum advantage.

The hopelessly high error rates of quantum computers are why some physicists try to create fake quantum supremacy (= which was disproven later ) or fake advantage by letting still-useless quantum computers output random meaningless numbers or detect random-number photons (= called boson sampling,  this p.1-right-2nd-paragraph ), into which physicists try to hide the many errors caused by their hopelessly-impractical error-prone quantum computers ( this-last-paragraph,  this-1st, last-paragraphs ).

This p.3(or p.2)-upper says
"But in the NISQ era we do Not have the quantum resources to implement error correction and so experiments are highly noisy. For example, Google’s recent quantum supremacy experiment estimated that their fidelity (= accurate rate ) was merely ∼ 0.2% (i.e., the experiment was ∼ 99.8% noise = 99.8 % error rate ! ) and that their fidelity will further decrease as they scale their system. Therefore their quantum supremacy claim hinges on whether or not random circuit sampling is intractable in the high noise (= error ) regime, in which there is only a small signal of the correct experiment in a sea of noise, and this signal diminishes with system size"

↑ Google quantum computer's error or noise rate is extremely high = 99.8% (= only 0.2% accuracy or fidelity ) in the random sampling claiming dubious supremacy, which means the random meaningless numbers obtained by Google's quantum computer are just results of errors (or noise ) irrelevant to quantum mechanical calculation.  ← No evidence of quantum mechanical calculation speed-up nor supremacy

The present quantum computers, always giving wrong answers, cannot prove quantum supremacy based on (fictional) superposition or parallel-world simultaneous computations.

↑ The fact that the present (useless) quantum computers can Not eliminate their errors (= noise ) means all calculated results given by the error-prone quantum computers are Not right answers nor true quantum mechanical results, whether they are random meaningless or non-random numbers.  ← It is impossible to prove that the (unseen, fictionally-faster) quantum mechanical superposition or 2^53 parallel-universe dream-like calculations (= which are unobservable with No evidence ), allegedly inaccessible to classical computers, really happen ( the 53 of the imaginary 2^53 superposition states is Google's quantum computer's qubit number,  this 8~9th paragraphs,  this middle,  this 26-29th paragraphs ).

Google quantum computer Sycamore uses a (classical) superconducting circuit's electric current's two energy states as a quantum bit or qubit 0 or 1, and they wrongly think applying (classical) laser light or microwave pulse to qubits may realize ( imaginary unseen ) quantum parallel-world superposition states ( this-Where is it based on ?  this 4-6th-paragraphs,  this 7-8th-paragraphs,  this p.2-right,  this p.1-11,  this last-paragraph,  this p.13-14 ).

No evidence of quantum supremacy based on superposition (= which is just the intermediate or classically mixed 0 and 1 state, irrelevant to fictional quantum parallel worlds ), because qubits are supposed to converge to only one single state in only one world by measurement ( this 5-6th-paragraphs ).

↑ They falsely think each qubit may take the 0 and 1 states simultaneously, hence their 53-qubit quantum computer could unrealistically take 2^53 states simultaneously using 2^53 superposition parallel worlds (← impossible ! ) through qubit gate operations applying classical laser light or microwave pulses to the qubits ( this 5-8th-paragraphs,  this 9-12th-paragraphs,  this 5-7th-paragraphs ), which may be hard to simulate by a classical computer's single world; this illogical explanation based on imaginary, unseen quantum parallel-world superposition causes the false quantum supremacy claim ( this-2nd-paragraph,  this 4th-paragraph ).

This (fictional) quantum superposition has No evidence, because the quantum superpositions are unobservable ( this 4-5th-paragraphs ) and unidentifiable even indirectly, hidden in their random meaningless numbers (= random meaningless numbers tell us nothing about whether quantum calculations different from classical computers really occur ) and error-prone results (= Google's supremacy random numbers are just results of 99.8% errors or noise with only 0.2% fidelity = Not accurate at all, so No evidence of quantum superposition's powerful calculations.  this 9th-paragraph,  this p.16 ).

Ordinary practical classical computers can always obtain right answers with No errors and can give various random numbers more quickly, so there is No quantum computer's supremacy or utility, which is why all the quantum computers are still useless despite their (false) quantum supremacy claim ( this-last,  this last-paragraph,  this 3rd-last-paragraph ).

In other words, quantum computers are hopelessly error-prone and unable to correct their errors (= quantum computers are unable to give right useful non-random numbers ), so physicists have No choice but to rely on outputting random meaningless numbers ( this lower ) into which they try to sneakily hide the quantum computer's errors.  ← No quantum advantage.

The quantum advantage claims of the Chinese and the Canadian Xanadu photon quantum computers based on boson sampling are also fake, with No speed-up.

Chinese and Canadian Xanadu claimed quantum advantage using photon quantum computers (= which are extremely error-prone and Not even computers ) based on useless task called boson (random) sampling that has No practical use ( this last paragraph ).

↑ This photon quantum computer's boson (random) sampling (= just randomly detecting lights or photons ) is based on wrong assumption of non-existent classical indivisible photons, so actually No quantum advantage.

Even if we suppose a photon could split into multiple superposition states ( this p.2-right-2nd-paragraph ) or parallel worlds (= which ridiculous idea is necessary for claiming quantum advantage ), classical computer outperformed the photon's quantum computer's (Gaussian) boson sampling (= BS or GBS ) after all, hence quantum advantage was officially invalidated ( this p.1-right-2nd-paragraph ).

The 2nd paragraph of this site says
"Specifically, our recent work shows that Gaussian boson sampling quantum supremacy experiments can in fact be simulated fairly quickly on a classical supercomputer, and the quality of the simulation is higher than the experiment under all thus far testable metrics"

Quantum supremacy, speed-up or advantage proved to be fake, just illusion.

Quantum computer can never give right answers because of errors.

Quantum computer's error correction is disastrous, hopeless, unable to correct their errors.  = impractical forever.

(Fig.S)  Quantum error correction is disastrous, impractical forever, increasing errors instead of decreasing them !

Quantum computer is already dead due to its inability to correct errors.

Despite extremely long years of research, quantum computers are still useless (= forever ) due to the inability to correct the errors (= or noise ) of these hopelessly error-prone machines ( this 6-9th-paragraphs ).

This challenge 2 says
"Still, achieving fault-tolerant quantum computation, where quantum computations can be performed reliably despite errors, remains a significant challenge."

To hide this inconvenient truth of already-dead quantum computer, the media, corporations and academia are constantly spreading overhyped fake news like this or this 3~5th paragraphs,

"The promise is certainly there, but so is the hype..
The hunt hasn’t gone particularly well, but that may not matter now. In the last couple of years, theoretical and experimental breakthroughs have enabled researchers to declare that the problem of noise might (= just baseless speculation, still useless now ) finally be on the ropes.. "

Quantum computer error rates are many orders of magnitude higher than those of practical classical computers, and No progress in (hopeless) quantum computer research has been made.

All the current quantum computers are unable to give right answers, because even the best quantum computer's error rate is extremely high = 10^-2 ~ 10^-4, which is far worse than the ordinary practical classical computer's error rate (= only less than 10^-17,  this p.1-left-lower,  this 2nd-paragraph,  this 17th-paragraph,  this p.8-9 ).

For the quantum computer to be practical, physicists have to build a dreamlike quantum computer consisting of millions of qubits ( this 4-6th-paragraphs ) with an error rate of less than 10^-15, which is impossible to realize forever ( this abstract,  this 5th-paragraph,  this 7th-paragraph,  this 5th-paragraph,  this p.11,  this-Error rates in quantum computing ).

"Error rate = 1% = 0.01" of two-qubit gate operation means when we perform 100 two-qubit gate operations, one operation causes error, which is impractical, and must be reduced to 10-15 error rate for practical use.
Fidelity = 1 - error rate ( this p.7-(15) ), so the error rate of 1% (= 0.01 ) equals the fidelity of 99% (= 0.99,  this-middle-two-qubit gate error ).

As they repeat each qubit's gate operation, the quantum computer's errors (= or noise ) accumulate, resulting in always giving wrong answers.  ← No quantum computer's advantage over the current practical ordinary error-less classical computer or laptop.

For example, the error rate of Google's latest quantum computer is still far higher (= each two-qubit operation's error rate is 6.7 × 10^-3,  this p.5-left-2nd-paragraph ) than the practically-required level (= less than 10^-15 ), and the accumulated error rate after outputting random meaningless numbers in their dubious supremacy experiment drastically increased to as high as 99.8% (= only 0.2% fidelity or accuracy,   this p.3(or p.2)-upper,  this p.3-right,  this-middle two-qubit gate error ), which is just the result of errors, Not of quantum mechanical calculation !
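To see how such a small-looking per-gate error rate wipes out the final result, here is a back-of-the-envelope sketch assuming independent gate errors that simply multiply (the gate counts are illustrative assumptions, Not the exact circuit sizes of any real experiment):

def circuit_fidelity(per_gate_error, num_gates):
    # If each gate independently succeeds with probability (1 - p),
    # the whole circuit succeeds with probability (1 - p) ** N.
    return (1.0 - per_gate_error) ** num_gates

# Illustrative numbers: a ~0.7% two-qubit gate error over a ~1000-gate circuit
# leaves a final fidelity of a fraction of a percent, i.e. almost pure noise.
for per_gate_error, num_gates in [(6.7e-3, 1000), (1e-2, 100), (1e-15, 10**12)]:
    f = circuit_fidelity(per_gate_error, num_gates)
    print(f"per-gate error {per_gate_error:g}, {num_gates} gates -> "
          f"circuit fidelity {f:.3%}, accumulated error {1 - f:.3%}")

Only a per-gate error rate many orders of magnitude lower (the 10^-15 range cited above) keeps the accumulated error negligible over a long computation.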

If physicists try to detect and correct errors, their error-correction manipulation itself further increases and worsens their original error rate.  ← Quantum error-correction operation itself has an adverse effect with No correction.

Even in Google's latest and best error correction experiment, the resultant error rate increased to about 3.0% (= 3 × 10^-2 ) from the original error rate (= 10^-3 ) by the current devastating quantum error correction operation ( this 3rd-last-paragraph,  this-3rd-last-paragraph,  this 2nd-paragraph ).

Furthermore, Google's quantum computers could just detect errors, and could Not correct their errors ( this 3rd-paragraph ), so useless.  ← No hope nor progress in quantum computer's error correction ( this p.2-left-3rd-paragraph ).

Also in other types of quantum computers, such as ion qubits (used by Honeywell, Quantinuum) that are much slower, more impractical and impossible to scale up, each manipulation of quantum error correction (= QEC ) worsens and increases errors by 2.7% or 2.7 × 10^-2 (= the error correction manipulation has an adverse effect = accumulating and increasing errors instead of reducing them ! ) from the initial starting error (= SPAM = 1.7 × 10^-3,  this p.4-right ).

In the case of preparing a slightly more complicated initial qubit state (= the above deals with the simplest case ), the error rate easily exceeds 10% (= fidelity is less than 0.9 ), and one error-correction operation just worsens or increases the error by 25~28% instead of decreasing it ( this-9th-paragraph,  this p.4-right ).  ← completely impractical.

"Fault-tolerant" or "erasure" errors have nothing to do with error-correction, they just discard error-qubits instead of correcting them.

A fault-tolerant quantum computer does Not perform error correction but just discards qubits (= instead of correcting errors ) when a qubit's error is detected, so almost all qubits will disappear (= so it is unable to give right answers ) due to being "discarded" while calculating large numbers on error-prone quantum computers, which is impractical.

Dreamlike fault-tolerant quantum computers reducing or correcting errors are hopeless and far from reality ( this 8th-paragraph,  this-challenges in quantum computing,  this a fundamentally different technology,  this p.1-right-1st-paragraph ).

The point is the present hopeless error-prone quantum computers are unable to give right answers and correct errors (= the operation of error correction itself increases their original errors instead of decreasing errors ! ).

This is why physicists created misleading concepts such as "fault-tolerant" (or "error-resistant", "toward error-free", which are Not error-free at all ) and "erasure errors" that have nothing to do with error correction, so they are still error-prone and useless.

In this deceptive "fault-tolerant quantum computing", they just discard (= instead of correcting ) all qubits that show at least one error, as post-selection ( this 6th-paragraph,  this-p.1-left-3rd-paragraph,  this p.6-right-last-paragraph ), which means almost all qubits would be discarded for showing even slight errors (= so these removed qubits cannot be used for giving final answers ) when the current error-prone quantum computers try to calculate large numbers, which is impractical.

Don't be misled by hyped news like this research paper (= the original paper does Not use the misleading word 'error free' ), which used only fewer than 16 ion qubits that showed more than 2.5% (= 2.5 × 10^-2 ) ~ 20% error rates or infidelities ( this p.2-right-last-paragraph, p.4-Fig.3c ), which is impractical, still far worse than the 10^-15 error rate required for practical computers.  ← Their impractical fault-tolerant operation has an adverse effect, increasing error rates to 2.5% ~ 20% (= the error rate or infidelity of two logical qubits consisting of 16 physical ion qubits is especially bad and high,  this p.3-right-last-paragraph, p.4-right ) from the original two-qubit error rate or infidelity (= 0.3~1%,   this p.3 ).

↑ It is extremely difficult to find and correct the errors of such error-prone quantum computers even with very small numbers of qubits, so each time they find some errors using a flag ancilla qubit, the current (deceptive) fault-tolerant operations just discard and repeat the same qubit-calculation operations, or post-select only convenient answers ( this p.7-left-2nd-paragraph ) instead of locating and correcting errors ( this p.2-right-upper,  this p.3-left-2nd-paragraph,  this p.4-last-paragraph,  this Fig.2, p.8-left ); these inefficient error-discard-and-repeat methods take too much time and cannot reach final correct answers when the calculations are so large and complex as to cause at least one error in the calculation process.
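A crude way to see why this discard-instead-of-correct approach collapses for larger computations is to compute the fraction of runs that survive post-selection when every qubit must independently show no detected error (the per-qubit error-flag probability and qubit counts below are illustrative assumptions, Not measured values from these papers):

def acceptance_fraction(per_qubit_flag_rate, num_qubits):
    # A run is kept only if no qubit raises an error flag,
    # so the kept fraction shrinks exponentially with the number of qubits.
    return (1.0 - per_qubit_flag_rate) ** num_qubits

for num_qubits in (10, 48, 280, 1000):
    kept = acceptance_fraction(0.05, num_qubits)   # assume a 5% error flag per qubit
    print(f"{num_qubits} qubits -> acceptance fraction {kept:.3e}")

With only a few percent chance of an error flag per qubit, essentially every run of a few-hundred-qubit circuit gets thrown away, which matches the >99% discard rates quoted in the QuEra section below.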

QuEra-Harvard-MIT atomic qubits are impractical, full of errors and hypes  (2/7/2024).

Harvard-QuEra's fault-tolerant quantum computer (= Not real error correction but just discarding error qubits, postselectively ) is disastrous, still error-prone (= due to the inability to correct errors ), far from practical use.

(Fig.H) Harvard-QuEra used only 280 unstable atoms as qubits with extremely high error rates and No ability to correct errors.  ← practical quantum computer is hopeless

QuEra-Harvard's error-prone quantum computer with only 280 qubits is far from practical computer.

The recent QuEra-Harvard overhyped "game-changing" quantum computer with only 280 neutral atoms (= each atom's two energy levels are used as the bit states 0 and 1 ) trapped in laser light, or 280 physical qubits (which were divided into 48 logical qubits, this 5th-paragraph ), is still error-prone and far from the practical quantum computer with fewer errors that will require millions of qubits.

The 8th and 12th paragraphs of this hyped news say
"But qubits can easily be disturbed, making them notoriously error-prone. Roughly 1 in 1,000 fail, versus 1 in 1 billion billion bits in conventional computers (= conventional classical computer has far less errors than the impractical error-prone quantum computer )."

"In 2023, the Google Quantum AI Lab demonstrated a 2.9% error rate using three logical qubits; Quera's error rate is 0.5% with 48 logical qubits (= this is fake news, QuEra's error rate is very bad = more than 90%, as I explain below ), which falls far short of practically-required error rate of 10-15."

QuEra-Harvard's new quantum computer can Not correct errors, instead, they just 'discarded' error-detected qubits, which is useless.

The important point is that this QuEra's latest alleged-fault-tolerant quantum computer is still not only error-prone but also unable to correct errors like Google's quantum computer ( this 3rd-paragraph ).

These 6~7th paragraphs say
"Instead of fixing mistakes (= Not error correction ) that occur during computations, the processor used by the Harvard team incorporates a post-processing error-detection phase. During this phase, erroneous outputs are discovered and discarded (= error qubits were just discarded without being corrected )."

This 4th-last paragraph says
"What the Harvard team’s processor does, rather than correct errors during calculations, is add a post-processing error-detection phase wherein erroneous results are identified and rejected (= No error correction )."

↑ This means when calculating large numbers that will cause errors to a lot of qubits, all those error qubits must be discarded without being corrected, and as a result, almost all qubits are unusable for answers of the calculations due to being discarded, so this fault-tolerant quantum computer is useless.

Discarding all qubits which showed errors without error correction is meaningless.

This original paper (= p.4-left-2nd-3rd-paragraphs,  or this p.4-right ) says
"Averaged over the five computation logicals, we find that, by using the fault-tolerant initialization ( postselecting on the ancilla logical flag not detecting errors = discarded all qubits that showed errors without error-correction ).. physical two-qubit gate fidelity (= 99.5% = so error rate 0.5% in the last paragraph of this hyped news says about this only two-qubit-gate, Not 48 logical qubits )"

"Furthermore, we can postselect on all stabilizers of our computation logicals being correct; using this error-detection approach (= meaning discarding all qubits that showed errors detected by stabilizers, without error-correction ), the GHZ fidelity (= how correctly GHZ state = 0000 or 1111 expressed by four logical qubits can be generated ) increases to 99.85%, at the cost of postselection overhead"

"for example, discarding just 50% of the data improves GHZ fidelity to approximately 90%. (= by discarding 50% of all qubits, which showed errors, they could generate four-logical qubit state of 0000 or 1111 called GHZ state with the probability of 90% and 10% error rate. This Fig.3d shows GHZ fidelity = 0.9 in acceptance fraction of 0.5 meaning 50% qubits discarded without error correction )"

↑ Postselecting on the ancilla logical flag or stabilizers being correct means they used about half of all qubits as error-detection qubits, called ancilla qubits or stabilizers, instead of qubits for calculation (= each logical qubit consists of 7 data and error-detection physical qubits, this Fig.3a ), and they only post-selected qubits whose ancilla or stabilizer error-detection qubits showed no errors, discarding all the remaining error qubits.

And even after discarding 50% of all (atomic) qubits because they showed errors, the remaining qubits showed 10% errors (= 90% fidelity ) in only four logical qubits (= 0000 or 1111 ), which is a miserably high error rate and impractical.

Error-detection qubits called ancilla or stabilizers showed errors. → All qubits related to them were discarded without error correction.

This "Quantum error correction with surface codes" says
"Bob wants to send Alice a single bit that reads "1" across a noisy communication channel. Recognizing that the message is lost if the bit flips to “0”, Bob instead sends three bits: “111 (= one logical qubit consisting of 3 physical qubits )”. If one erroneously flips, Alice could take a majority vote = error can be corrected like 101 → 111 (= a simple error-correcting code )"

"we arrange two types of qubits on a checkerboard. Data qubits (= used for calculation ) on the vertices make up the logical qubit, while measure qubits (= error detection qubits ) at the center of each square are used for so-called stabilizer (= error detection ) measurements. These measurements tell us whether the qubits are all the same, as desired, or different, signaling that an error occurred"

This p.2-left-4th-paragraph says
"this code cannot be used to correct such errors. We thus perform state stabilization by post-selecting runs in which no error is detected by the stabilizer measurements (= when stabilizer qubits detect errors, all relevant qubits are discarded without error correction )"

"Fidelity (= XEB )" is equal to "1 - error rate"  so fidelity of 90% means error rate of 10%.

Fidelity is equal to 1 minus error rate ( this p.59,  this p.3-left-2nd-paragraph ).
So 90% or 0.9 fidelity means error rate of 10% or 0.1 (= 1 - 0.1 ).

In this research, "XEB (= FXEB )" was used as fidelity ( this p.7,  this p.20-upper ) where XEB (= fidelity ) = 1 means no error, and XEB = 0 means 100% error rate.

This p.5-right-4th-paragraph says
"To characterize the distribution overlap, we use the cross-entropy benchmark (XEB), which is a weighted sum between the measured probability distribution and the ideal calculated distribution, normalized such that XEB = 1 corresponds to perfectly reproducing the ideal distribution (= no error ), and XEB = 0 corresponds to the uniform distribution, which occurs when circuits are overwhelmed by noise (= error rate 100% )..
We note that the XEB should be a good fidelity benchmark"
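For reference, the linear cross-entropy benchmark described in the quote is typically estimated from the measured bitstrings roughly as in the sketch below; the "ideal" distribution here is a randomly generated classical stand-in, Not data from any real device:

import random

def linear_xeb(samples, ideal_probs, num_qubits):
    # Linear XEB: F = 2^n * <P_ideal(x)> - 1, averaged over the measured bitstrings x.
    # For random-circuit-like (Porter-Thomas) ideal distributions, F is about 1 when the
    # samples follow the ideal distribution and about 0 when they are uniform noise.
    dim = 2 ** num_qubits
    mean_prob = sum(ideal_probs[x] for x in samples) / len(samples)
    return dim * mean_prob - 1.0

random.seed(0)
n = 10
bitstrings = [format(i, f"0{n}b") for i in range(2 ** n)]
# Toy stand-in for the classically computed output distribution of a random circuit
# (exponentially distributed weights, roughly Porter-Thomas shaped).
weights = [random.expovariate(1.0) for _ in bitstrings]
total = sum(weights)
ideal_probs = {b: w / total for b, w in zip(bitstrings, weights)}

ideal_samples = random.choices(bitstrings, weights=weights, k=20000)   # a "perfect" sampler
noise_samples = random.choices(bitstrings, k=20000)                    # uniform random noise
print("XEB, sampling the ideal distribution:", round(linear_xeb(ideal_samples, ideal_probs, n), 2))
print("XEB, uniform random noise:           ", round(linear_xeb(noise_samples, ideal_probs, n), 2))

A fidelity of 0.2% or 0.1 in this metric therefore means the measured samples are almost indistinguishable from uniform noise.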

QuEra's 48-logical-qubit quantum computer is extremely error-prone, with a 90% error rate even after the 99% of qubits that showed errors were discarded, so this fault-tolerant method is impractical.

This QuEra-Harvard fault-tolerant quantum computer with 280 physical qubits divided into 48 logical qubits is still error-prone, impractical, and unable to correct its errors.

This p.5-right-5th-paragraph says
"We obtain an XEB (= equal to fidelity or 1 minus error ) of approximately 0.1 for 48 logical qubits"

↑ This means QuEra's 48-logical-qubit quantum computer's error rate is a miserably bad 90% ! (= 0.9 = 1 - 0.1 ), which is completely useless, and this-3rd-last-paragraph claiming a 0.5% error rate on QuEra's 48 qubits is false and just hype.

Furthermore, this very bad fidelity = 0.1 was obtained after they discarded more than 99% of all qubits (= accepted fraction is less than 0.01 in 48 qubits, this Fig.5e ), whose ancilla or stabilizer qubits detected errors.

This p.12-left-3rd-paragraph says
"Typically, error detection refers to discarding (or postselecting) measurements in which any stabilizer errors occurred (= No mention of error correction )"

This Fig.7c says
"48-qubit XEB sliding-scale error-detection (= Not error-correction ) data. The point with full postselection on all stabilizers being perfect returned only eight samples (= all other 138600 samples were miserably discarded due to showing errors, Fig.7a )"

As a result, the QuEra 48-logical-qubit quantum computer's error rate is more than 90%, which is completely impractical, and they can Not correct errors (= instead, they just discarded erroneous qubits ).

This 13-14th paragraphs also say
"they needed to postselect on the event that no errors occurred (= "postselect no-error qubits" means "discarded error-qubits" ), which happened about 0.1% of the time (= 99% qubits showed errors, hence get discarded ) with their largest circuits. This just further underscores that they haven’t yet demonstrated a full error-correction cycle (= No error correction ).
They don’t claim to have demonstrated quantum supremacy with their logical qubits (= meaning ordinary classical computer is far faster and more useful than the current error-prone impractical quantum computer )."

Quantum computers cannot correct errors (= only ordinary classical computers can correct errors )

This 17th paragraph says
"This isn't full error correction. "What is happening in the paper is that the errors are corrected only after the calculation is done (= which means the calculated results by their quantum computers are still error-prone even after discarding errors, without error-correction ). So, what we have Not demonstrated yet is mid-circuit correction, where during the calculation, we measure... an indication of whether there's an error, correct it, and move forward."

This p.16-right-2nd-paragraph says their (deceptive) error correction does Not mean the error correction of the error-prone quantum computer hardware by physically flipping qubits. Instead, it just means classical computer-simulated software correction ( this p.17-right-IX. ) due to the impractically-error-prone quantum computer hardware.

Caltech's "erasure" errors also just discarded qubits that showed errors instead of correcting errors, which is impractical.

The recent Caltech's "erasure" errors published in Nature is also misleading, because they did Not correct the detected errors at all.

They used only less than 20 atomic qubits (= that is far from millions of qubits required for practical quantum computer ) where two atomic energy levels (= pseudo-ground and excited states ) were used as each qubit's 0 (= g ) and 1 (= r ) states ( this p.1-Fig.1 ).

And they just discarded (= or erased, excised,  this p.2-left-last-paragraph ) all qubits that showed errors, instead of correcting errors.

The error rate for preparing each two-qubit Bell state (= one atomic qubit in the ground state and the other atomic qubit in the excited state ) is very bad = 5% = 0.05 (= fidelity is 0.95 ).

Even after artificially discarding these qubits showing errors (= instead of correcting errors ), their final error rate is as bad as 0.4% (= fidelity is 0.996,  this-4th-last-paragraph,  this p.2-Fig.1-right ), which is far from the practically-required error rate of 10^-15.

↑ "SPAM corrected" also just discared error-qubits and post-selected only other intact qubits without correcting errors ( this p.7-Fig.5 ).

The error-correction of such useless quantum computers is still just pie-in-the-sky empty theory with No experimental realization ( this 5th-last-paragraph,  this 4th-last-paragraph,  this 4th-last-paragraph,  this p.32-conclusion,  this 7th-paragraph ).

Quantum computer is useless for calculating molecular energies or drug discovery forever.

IBM's current largest quantum computers are easily surpassed by ordinary classical computers.  ← Quantum computers are dead.

(Fig.I)  IBM quantum computer is unable to get right answers because of its high error rates, and ordinary classical computers are far faster and more accurate than quantum computers.

There is No quantum computer's speed-up nor utility.

IBM's largest quantum computers still have only fewer than 500 qubits and extremely high error rates of more than 1% or 0.01 per two-qubit gate operation ( this p.4-Fig.S1-c ), which falls far short of a practical computer, which will require more than millions of qubits ( this summary,  this 2nd-paragraph ) and a much lower error rate of 10^-15 ( this 3rd-paragraph,  this 22nd-paragraph ) than the current error-prone quantum computers.

So IBM gave up correcting the errors of their hopelessly error-prone quantum computers and tried to rely on an illegitimate error- (or noise-) mitigation model with No quantum supremacy ( this 4th-last-paragraph,  this 2nd-last-paragraph,  this p.2 ), where all IBM's quantum computer can do is increase its error rate meaninglessly, and they have to rely on the ordinary classical computer to reach right answers by extrapolation in this error-mitigation model without quantum computer error correction ( this 4-6th, 16th-paragraphs ).

This 10th-paragraph says
"Paradoxically, IBM researchers controllably increased the noise in their quantum circuit to get even noisier, less accurate answers (= due to increasing errors ) and then extrapolated backward to estimate the answer the computer would have gotten if there had been no noise (= using ordinary classical computer by using error-mitigation model without quantum computer's error correction ). "

↑ Quantum computers alone can Never get right answers.

As shown in this and this, the unmitigated values (= IBM's quantum computer's raw results ) are completely different from the exact ideal values ( this Fig.2,3 ), meaning the IBM quantum computer itself always gives wrong answers, which is completely impractical and useless.

↑ After all, this illegitimate error-mitigation method also turned out to be meaningless with No utility, because ordinary classical computer or laptop could calculate and give more accurate answers much faster than IBM quantum computer's ad-hoc error-mitigation model ( this abstract,  this abstract ), hence No quantum advantage ( this 3rd-last paragraph ).

These 7-11th paragraphs say
"The IBM result seemed like a real gut punch to classical computing, but Not enough to cause a knock-out. Within 2 weeks of the announcement, researchers.. rose to the challenge. They have pre-published a paper on their results and report that By adopting a tensor network approach, we can perform a classical simulation that is significantly more accurate than the results obtained by the quantum device. "

"Not to be outdone, a recent pre-print.. stated, Our classical simulations on a single core of a laptop are orders of magnitude faster than the reported wall time of the quantum simulations (= IBM 127-qubits' ad-hoc error-mitigation method was easily surpassed by the ordinary classical laptop computer )"

↑ After this IBM paper (= using dubious error-mitigation, Not legitimate error correction ) was published, many physicists showed that ordinary classical computers (= classical tensor-network simulation of quantum imaginary superposition ) outperformed any quantum computers (= still-impractical due to quantum computers' high error rates and inability to give right answers ) with arbitrary numbers of qubits ( this p.1-abstract, p.1-left ).  ← Quantum computers proved to be officially dead.

IBM dubious 127-qubit quantum computer's utility claim published in Nature was officially dead.

After many physicists denied the IBM dubious claim of 127-qubit quantum computer's speed-up or utility based on illegitimate error mitigation model in Nature, several papers officially disproving the IBM quantum computer's utility were published.

The 3rd-last paragraph of this news says
"Applying their network to the Ising-model problem, Tindall and his colleagues solved the problem on a classical computer with greater accuracy than achieved using the quantum one."

↑ This newly-published paper says
"we can perform a classical simulation that is significantly more accurate and precise than the results obtained from the quantum processor."

The recent Science paper ( this p.7-discussion ) also says
"Our work demonstrates that classical algorithms (= using ordinary classical laptop ) can simulate the expectation values of the quantum circuits corresponding to the kicked Ising dynamics experiment on 127 qubits (= IBM quantum computer ), not only faster than but also well-beyond the accuracy of the current quantum experiments."

↑ From the beginning, there was No such thing as (error-prone) quantum computer's utility nor speed-up (= as shown in the still-useless quantum computers ), there has been only unfounded hype.

Overhyped IBM's new 1000-qubit quantum computer (= still far from a practical quantum computer requiring millions of qubits ) is still error-prone, completely useless.

Recently, IBM announced a new largest 1121-qubit quantum computer called Condor and a new smaller 133-qubit quantum computer called Heron.  ← This is weird, why is the smaller new version also needed ?

↑ Even this allegedly largest 1000-qubit quantum computer is far from commercialization of practical quantum computers that will require millions of qubits.

Again, the overhype-prone IBM has Not published how error-prone their dubious largest 1000-qubit quantum computer is ( this 6-7th-paragraphs talk only about qubit-size, No mention of error rate at all.  ← this is the point ), because this new 1000-qubit quantum computer's error rate is still impractically high.

This is why IBM simultaneously released a new smaller version, the allegedly-less-error-prone 133-qubit Heron quantum computer, whose error rate is said to be (only) 3~5 times lower than the previous one's (= No detailed published data except vague news,  this 2nd-paragraph,  this 1st,9th-paragraphs,  this 5th-paragraph ); this error rate is still much higher (= more than 0.1% ) than the practically-required 10^-15.

↑ Quantum computers are deadend with No hope nor progress.

IBM's error correction, which is Not true error correction, generates impractically high errors

The latest IBM (fake) error-correction research published in Nature in 2024 used their old Falcon quantum computer with only 27 superconducting qubits (= still Not a computer,  this p.2-right-Experimental results,  this 5th-paragraph ), which means IBM's latest 1000-qubit quantum computer is just a dummy and still useless.

They used only seven qubits (= far from millions of qubits required for practical computer ), and when two (flag) qubits of those seven qubits produced errors, which were detected, they artificially discarded those error qubits and post-selected only remaining error-free qubits without error-correction ( this p.7-right-4th-paragraph ). = qubit-acceptance rate (= rate of remaining qubits ) is bad = only 25~75% ( this p.4-left-3rd-paragraph )

↑ In spite of this artificial discarding or postselection of intact qubits, each error-measurement operation caused and added an extremely high error rate of 22% (= each two-qubit gate error rate is also high = 2%, and their total error rates increased and accumulated with each error-detection operation instead of decreasing ! ), which is far worse than the practically-required error rate of 10^-15 ( this p.11-left-6~8th-paragraphs ).

As a result, even in the latest research, the leading quantum computer companies can Not correct errors of even small numbers of qubits, which means quantum computer's research is already deadend and hopeless.

↑ Even researchers in the quantum computer leading company IBM (and Google ) only aim to publish papers as slaves to top journals (= and ruin their scientist careers by this useless pseudo-science which is masked by a lot of overhyped news disseminated by themselves just helping greedy academic journals ) instead of really aiming to build useful machines.

Quantum computers are completely useless for calculating molecular energies or discovering drugs.

They force the practical classical computer to perform most of the complicated molecular energy calculations, and try to make the useless quantum computer take all the credit for the classical computer's work as a (deceptive) hybrid computer.

(Fig.D)  Quantum computers with small numbers of qubits and high error rates are Not useful calculators at all.  ← Quantum computer's drug discovery will never happen.

It is impossible for such error-prone quantum computers to give right answers for molecular energies for drug development or climate change, contrary to the media hype.

Physicists tried to illegitimately combine ordinary practical classical computers with the current useless error-prone quantum computers, calling them "(deceptive) hybrid computers (= called variational quantum eigensolvers or VQE )", which are substantially classical computers, Not quantum computers.

↑ Almost all of the important, complicated molecular energy calculations (+ error correction of the impractical quantum computer ) must be conducted by ordinary classical computers ( this 6~8th paragraphs,  this 5th-paragraph,  this 4~5th-paragraphs,  this 2nd-paragraph,  this p.32-33,  this p.7-Fig.1,  this-abstract-upper,p.2-left-2nd-paragraph ).
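To make this division of labor concrete, here is a minimal caricature of a VQE-style hybrid loop; everything except the expectation-value call is classical, and in this sketch even that call is just a classical stand-in function (the function names and the one-parameter energy landscape are hypothetical, Not any vendor's API):

import math

def energy_expectation(theta):
    # Stand-in for the quantum device: in a real VQE this would run a small,
    # error-prone circuit to estimate <H> for a trial state; here it is
    # simply a classical function of the variational parameter.
    return 1.0 - 0.8 * math.cos(theta)      # hypothetical one-parameter energy landscape

def classical_optimizer(objective, theta=0.3, learning_rate=0.2, steps=50):
    # The classical computer does the real work: it proposes parameters, estimates
    # gradients by finite differences, and updates the parameters toward lower energy.
    for _ in range(steps):
        grad = (objective(theta + 1e-4) - objective(theta - 1e-4)) / 2e-4
        theta -= learning_rate * grad
    return theta, objective(theta)

theta, energy = classical_optimizer(energy_expectation)
print(f"optimized parameter {theta:.3f}, estimated lowest energy {energy:.3f}")

The quantum processor's only role in such schemes is the single noisy function evaluation, which is exactly why the classical computer deserves the credit.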

The current impractical quantum computers use only less than 50 bits or qubits ( this Fig.1 used only 6 qubits or 6 bitstring = still Not a computer at all,  this 1~4th paragraphs ) that are far inferior to practical classical computers with billions of bits and the (illusory) future practical quantum computer that will need millions of qubits.  There is No quantum computer's advantage in their phony hybrid computers' molecular calculation and chemistry ( this abstract, this 5th-paragraph,  this p.4-right-last,  this-introduction-4th-paragraph ) contrary to the media-hype ( this 4th-paragraph ).

↑ This (useless) quantum computer part with only fewer than 20 qubits (= and error-prone ! ) can do nothing but play with a very simple version of a (fake) meaningless (Hamiltonian) equation in vain, with No power to calculate any molecular energies by itself.

The recent research allegedly calculating the hydrogen molecule's energy or Hamiltonian used only a useless two-bitstring (= 01 ) ion quantum computer (= Not a computer at all ) with the help of a practical classical computer, which had to compute most of the complicated calculations of the hydrogen molecular energy parameters ( this p.1-right-lower, footnote, p.5-experiment ).  ← These deceptive quantum computer researches just demonstrated the power of classical computers instead of the still-impractical quantum computers with small numbers of qubits (= only several qubits )

Don't be deceived by hyped news falsely claiming quantum computers might predict some biological or medical results.  ← In all cases, ordinary classical computers (= Not useless quantum computer ) have to perform actual calculations.

Don't be misled by overhyped news falsely claiming (error-prone useless) quantum computers might be used for predicting disease, though actually they relied on ordinary classical computers for most of their calculations ( this 2~3rd paragraphs = this 3.2, Figure 2 )

Some research tried to connect the electric conductance of different nucleotides (= measured by an electrode device, Not a quantum computer ) to the still-useless quantum computer (= with only 3 qubits, Not a computer at all ) to identify nucleotides ( this 3rd-last-2nd-last paragraphs ).
↑ But after all, an ordinary classical computer running a quantum simulator (= Not a quantum computer ) gave a higher success rate than the error-prone IBM quantum computer with only 3 qubits ( this p.15-2nd-paragraph, p.16 ).  ← No quantum computer's utility

"Researchers uses quantum computing to predict gene relationships ?" is also untrue.
↑ Researchers had to also use ordinary classical computer, and their impractical quantum computer with only 6 qubits and many errors can be easily simulated or emulated by classical computer's simulator called Qiskit Aer simulator ( this Fig.2, p.2-right, p.5-methods ) including errors ( this how can Qiskit Aer help ? )

Also in the latest misleadingly-exaggerated news, physicists are unable to calculate any (practical) molecular or chemical reactions; instead, all they could do was (wrongly) disguise only one (useless) trapped-ion or single atomic qubit ( this Fig.1g- Fig.2-top ) illuminated by laser light (= No chemical reaction ) as some (irrelevant) molecules involved in photosynthesis, without conducting any quantum computer calculations using multiple qubits ( this 4-7th-paragraphs = using only one single ion qubit that is useless, this 7th-paragraph.  Another similar paper used only five ion qubits that are Not a computer, either, this 5th-last-paragraph,  this p.5-right-2nd-paragraph~p.6 ).

↑ These still-impractical quantum computers with only one or five (trapped ion) qubits (= disguised as some irrelevant chemical reaction without performing any molecular energy calculations ) are far inferior to ordinary practical classical computers or the (imaginary) future quantum computer which will allegedly need at least million qubits with No errors ( this 4th-paragraph,  this 2nd-paragraph ) for some useful molecular energy calculations, which is impossible to realize forever.

All types of qubits are hopeless, impractical forever.

(Fig.Q)  All types of quantum computers are extremely error-prone, unstable and impossible to scale up.

Photon quantum computers (= still Not computers ) are the worst, with the highest error rates among all types of qubits.

Photon quantum computers allegedly using photons, beam splitters and photodetectors are impractical forever, because their photons (= just weak classical light waves ), whose light polarization is used as a qubit state, are extremely fragile and easily lost, causing miserably high error rates.

Among all types of qubits, photon quantum computers' error rates are the worst and highest: more than 70% of the fragile photons (= just weak classical light ) are usually lost during operations (which means their error rates are more than 70%, this p.1-right ), and each photonic gate operation's error rate is hopelessly high = 60~90% (= the success efficiency is only 11~40%; when two fragile photons enter one logic gate composed of an ordinary beam splitter, only 24% of those photons can be detected and all other photons are lost even in the latest research,  this p.9-left-first-paragraph, which means the photon quantum computer's error rate per logic gate is impractically high, the worst = more than 76% due to severe photon loss,  this 3rd-paragraph,  this p.1-left-second-paragraph ).

And building the simple two-qubit gate using photon qubits is extremely hard ( this p.1-right-second paragraph,  this photon qubit ).

Photon quantum computer error rate (= more than 50% ) is hopeless.

The recent misleadingly-hyped photon qubit news says ( this or this 4~6th paragraphs )
"The researchers at the University of Tokyo developed a new method for generating GKP states. Their method is based on the interference of cat states, which are superpositions of coherent states (= just mixed weak classical lights ). The cat states are generated using a process called four-wave mixing."

"The researchers demonstrated that their method can generate GKP states with high fidelity (= this is false ). They also showed that the states are robust to errors caused by photon loss (= this is also incorrect, actually, loss of many photons caused high error rate ) and phase noise."

"The development of new logical states for FTQC with propagating light is a significant step (= which means still useless ) forward for the field of quantum computing (← ? )"

↑ They mixed multiple light waves with some periodically-changing amplitudes and phases using ordinary beam splitters ( this p.2 ), which is called GKP light state that is said to be robust to errors or photon loss (= this is untrue ), because even when some photon or light is lost, the other remaining lights of the mixed GKP lights may fix it.

But photon qubit is known to be the most error-prone due to massive photons' loss, so practical quantum computer or error correction is definitely impossible.

↑ This research's supplementary material says
photon's (detection) efficiency through some paths is only 50% (= 50% error rate !  this p.3-2nd-paragraph )
This p.5-2nd-paragraph says the success probability of generating GKP light is only 0.5%  ← completely impractical.

↑ Actually, this original paper ( this p.5-left ) says
"Our current limiting factor is the optical loss in the system.. Our current setup has the success rate of about 10 Hz which is Not sufficient for actual computation."

↑ After all, this experiment just generated some (classically) mixed light wave called GKP state with extremely low success rate ( this-lower-challenge generation of GKP state success rate is only 1% ) due to massive photons' loss and many errors (= No computation nor error correction has been done ).  ← Photon-qubit quantum computer will never be realized, researchers only aim at publishing papers in journals.

Hypes about (impractical) photon-qubit quantum computer

The 1st and 10th paragraphs of this recent overhyped news say
"Scientists.. have found a powerful new way to program optical circuits that are critical to the delivery of future (= meaning "still useless" ) technologies such as unhackable communications networks and ultrafast quantum computers.
Quantum computers are expected (= just speculation ) to unlock big advances in areas including drug development, climate prediction (← ? )"

↑ Because of the easy loss of photons (= or weak light ) exploited as fragile photon qubits, the success probability of gate operation (= just using an ordinary beam splitter and light detector with No useful computation ) is extremely low, as this paper (= p.5-left-last-paragraph, or this p.2-S.2-p.4 ) says
"This allows us to measure the success probability of the gate operation, which is defined as the ratio of (light or photons') coincidence counts in the target output modes over the total coincidence counts integrated over all outputs in one polarization channel. We.. measure a success probability of 0.36 ± 0.01, 0.27 ± 0.03 and 0.18 ± 0.04 (= success rates were only 18 ~ 36%, impractically too bad ), respectively "

↑ This severe loss of photons (whose light polarization is used as the qubit information ) and high error rate (= extremely low photon detection efficiency ) is why major companies such as Google and IBM try to use another type of qubit = superconducting-qubit quantum computers, which also have impractically high error rates ( this 5,15th paragraphs,  this last-paragraph ).

Other qubits

Most major companies such as Google and IBM focus on another type of quantum computer using superconducting qubits (= just classical circuits ) which also cause impractically-high error rates ( this 5th-paragraph,  this p.1-introduction-2nd-paragraph ), and the necessity to always cool their bulky qubits (= each qubit is as big as a millimeter ! this 3rd-last-paragraph ) to almost absolute zero makes it impossible to realize the (imaginary) practical quantum computers that will need millions of qubits ( this 4th-paragraph,  this 4th-paragraph ) forever.

D-Wave annealing machines for optimizing problems are Not true quantum computers nor faster than classical computers.

Topological quantum computer trying to use an illusory quasiparticle with fractional-charge called anyon or Majorana is unreal, fraudulent and impractical forever, Not achieving even a single qubit yet.

↑ The latest topological quantum computer is fake and unable to realize the true fault-tolerant quantum computer based on imaginary braids (= unseen non-existent braids ) at all ( this 9~10th paragraphs ).

Fluxonium qubits are useless.

MIT's new fluxonium qubits (= only 2 qubits !) are still error-prone, far from an (illusory) practical quantum computer that will need at least millions of qubits.

The current leading quantum computer companies such as IBM and Google use superconducting quantum bits or qubits (= which consist of just classical circuits, ) called transmon ( this p.6-left-1st-paragraph,  this 14th-paragraph ) whose error rate is the lowest (= still high, though ) among all types of qubits.

↑ But even this "best" transmon qubit suffers extremely high error rates, whose two-qubit gate error rate is 0.5 ~ 1% or 0.005 ~ 0.01 (= fidelity is 99.5 ~ 99 % ), which error rate is much higher than that of the current ordinary practical classical computers whose error rate is less than 10^-17 ( this 2nd-paragraph,  this 17th-paragraph,  this last-paragraph ).
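
As a rough sketch of what these per-operation error rates imply, the probability that a hypothetical circuit of N operations finishes without a single error is (1 − p)^N; the rates quoted above are treated as assumed inputs, errors are assumed independent, and the operation count is purely hypothetical.

# Illustrative only: probability of an error-free run for an N-operation circuit,
# using the per-operation error rates quoted above as assumed inputs.

def error_free_probability(p_error, n_operations):
    return (1.0 - p_error) ** n_operations

n_ops = 1_000_000                              # hypothetical number of operations
print(error_free_probability(1e-2, n_ops))     # transmon-like ~1% error  -> ~0.0
print(error_free_probability(1e-17, n_ops))    # classical-bit-like error -> ~1.0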

The recent (hyped) news claimed that MIT researchers created (only) two qubits of a new type of superconducting qubit called "fluxonium ( this Fig.1 )" whose two-qubit-gate error rate is 0.001 (= 99.9% accuracy or fidelity,  this 6th-paragraph ), which is still far worse than a classical computer and the (illusory) practical quantum computer whose error rate must be less than 10^-15 ( this 5th-paragraph,  this 3rd-paragraph,  this 7th-paragraph ), and the practical quantum computer will need at least millions of qubits ( this 19-21th paragraphs ), which is impossible to realize forever.

↑ In fact, this fluxonium qubit studied by MIT is Not new; however, No quantum computer company has wanted to replace its conventional transmon qubit with this fluxonium qubit ( this 2nd-last~3rd-last paragraphs,  this p.1-right-2~3rd paragraphs ).

This allegedly-new type of fluxonium qubit is much more complicated ( this 5th-paragraph ) and much slower than the conventional transmon qubit, hence scaling up or increasing the number of these fluxonium qubits is impractical and impossible ( this 2nd-last-paragraph,  this-last~10th-last paragraphs,  this p.2-left-2nd-paragraph ).

↑ The time needed to operate a two-qubit (= CZ ) gate in fluxonium qubits is about 2~3 times longer (= which means 2~3 times slower than the conventional transmon qubit; fluxonium two-qubit gate time is 85 ~ 200 ns,  this last paragraph,  this p.7-left-2nd-paragraph,Fig.4,  this p.1-right-lower ) than in the conventional transmon qubits whose gate time is 20~35 ns ( this p.2-left,  this Fig.1b,  this p.2 ).  ← No advantage, and the hopeless situation of quantum computers is unchanged.

Ion qubits are hopeless.

The ion-qubit quantum computer of Quantinuum is extremely slow and unable to be scaled up to a practical computer.  ← useless forever.

Trapped-ion quantum computers using unstably-floating ions (= two energy levels ) as qubits are known to be very slow, impractical, still error-prone and impossible to scale up = still less than only 50 ion qubits ( this 11th-paragraph ).

The recent hyped news claimed that Quantinuum's H1 ( consisting of unstably-floating trapped-ion qubits ) might successfully execute a fault-tolerant algorithm (= untrue ).

↑ Even in this Quantinuum latest ion-qubit computer consisting of only less than 20 (ion) qubits (= falling far short of the million-qubit practical computer ), its error rate is extremely high (= each 2-qubit gate error rate is 2 × 10^-3,  this p.3 ), and the error rate even after the so-called "(deceptive) fault-tolerant" operation is still much higher (> 1.0 × 10^-3,  this 6th-paragraph ) than the error rates required for the practical computer (= the practical computer's error rate must be less than 10^-15,  this abstract,  this p.1-left-lower ).  ← still far from practical use.

These ion-qubit quantum computers adopted by Quantinuum and Honeywell can never be put to practical use, like all other hopeless quantum computers.  ← Physicists have already known this inconvenient fact, but had No choice but to hide this disastrous truth under the current unrealistic dead-end mainstream physics.

First, it is impossible to scale up or increase the number of ion qubits to the practically-required million qubits, because precisely manipulating many unstably-floating ions (whose two energy levels are used as the qubit 0 and 1 states ) by many lasers is unrealistic ( this 12-13th-paragraphs,  this p.1-intro-1st-paragraph, this 2nd-paragraph,  this trapped-ion qubit section ).
Even the latest quantum computer consists of only less than 50 (ion) qubits ( this 4th-paragraph ).

Second, this ion-qubit quantum computer is more than 1000 times slower (= execution time is far longer ) than the current dominant superconducting qubits used by Google and IBM ( this p.4-right-3rd-paragraph ).

The gate time required for performing each two-qubit gate operation in ion qubits is longer than microseconds or μs (> 1000 ns,  this p.6,  this 2.2,  this p.2-left-1st-paragraph,  this p.5-right-lower ), which is far slower and longer than the gate time of superconducting qubits (= less than 50ns,  this p.2-left,  this Fig.1,  this p.11-Table.4 ).
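
For a rough sense of scale only, the sketch below turns these gate times into total gate time for a hypothetical run; the gate times quoted above are treated as assumed inputs, the gate count is hypothetical, and per-gate time is just one contribution to total execution time (ion shuttling, readout, etc. add more).

# Illustrative only: per-gate times quoted above, treated as assumed inputs.
ion_gate_time_s = 1e-6                 # trapped-ion two-qubit gate: > ~1 microsecond
superconducting_gate_time_s = 50e-9    # superconducting two-qubit gate: < ~50 ns

n_gates = 1_000_000                    # hypothetical number of sequential two-qubit gates
print(ion_gate_time_s * n_gates)                       # ~1 s of gate time alone
print(superconducting_gate_time_s * n_gates)           # ~0.05 s of gate time alone
print(ion_gate_time_s / superconducting_gate_time_s)   # >= ~20x slower per gate alone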

Furthermore, this latest Quantinuum H1 ion-qubit quantum computer could Not correct errors; instead, they just artificially "post-select" results to seemingly remove errors ( this p.7-left ), which ad-hoc method is inapplicable to the practical larger computer.

Even in the latest (exaggerated) research in 2023/11/29, they used only 51 ion qubits (= this 5th-paragraph ) with extremely high error rates = the error rate of preparing the initial state is as high as 25% (= fidelity is 0.75 ) and each two-qubit error rate is 2% (= fidelity is 0.98,  this p.8-appendix C,  this-middle two-qubit gate error ), which is far from the practical quantum computer requiring million qubits and a less-than-10^-15 error rate.

Also in the latest research in 2023/12, they could achieve only (impractical) 32 ion qubits with a lot of unsolved issues ( this 1st, 9-10th-paragraphs ).

↑ This latest 32-ion-qubit quantum computer's error rate is far too high (= 40%, or fidelity of 0.6 ) in some random-number generating operations compared to the error-free practical classical computer ( this p.9-Fig.8 ).

As a result, the current hopeless situation of impossibility of practical quantum computer is unchanged (forever).

Quantum computer is deadend, still having only one impractical ion qubit (= Not a computer at all ).

The 1st, 4th, 10th, last paragraph of this hyped news say

"Researchers at ETH have managed to trap ions using static electric and magnetic fields and to perform quantum operations on them. In the future (= just speculation, still useless ), such traps could be used to realize quantum computers with far more quantum bits"

"In this way, it has been possible in recent years to build quantum computers with ion traps containing around 30 qubits (= far less than millions of qubits required for useful quantum computer ). Much larger quantum computers, however, cannot straightforwardly be realized with this technique (= deadend ). The oscillating fields make it difficult to combine several such traps on a single chip,"

" A single trapped ion (= only one ion qubit, which is Not a computer at all ), which can stay in the trap for several days, could now be moved arbitrarily on the chip, connecting points "as the crow flies" by controlling the different electrodes"

"As a next step, Home wants to trap two ions (= meaning this research just trapped only one ion qubit = Not a computer at all ) in neighboring Penning traps on the same chip.. This would (= just speculation ) be the definitive proof that quantum computers can be realized using ions in Penning traps (← false. Several ion qubits alone cannot prove quantum computers )"

Trapping only one ion (= only one bit ) cannot make a quantum computer that will need millions of qubits, and their research is already deadend with No progress.

↑ This research just trapped only one Be+ ion (= whose two energy levels were used as a bit or qubit's states 0 or 1 ), and moved it a little by controlling external electromagnetic field in Penning traps, and No quantum computer was realized, contrary to the overhyped headline "larger quantum computers".

Ordinary ion quantum computer (= still Not a computer ) can Not have or confine more than 32 ion qubits to one chip or one trap area, hence, they have to inconveniently move each ion to another Penning trap (= impractical, taking much time ) to realize interactions between more ions ( this-13-14th-paragraphs,  this p.1-left-1st-paragraph,  this p.1-left ).

↑ Even this latest research could trap only one ion (= just only one quantum bit or qubit 0 or 1 ) in a very large, bulky trap chip (= one qubit size is as large as 100 μm ~ 1 mm, far bigger and bulkier than an ordinary classical computer's compact bit or transistor of only 40 nm size ), and move it slightly (= about 50 μm ) and slowly (= each motion took 4 ms ) at cryogenic temperature (= 6.5 K = impractical ) with No quantum computer calculation.

Practical quantum computer is said to need more than millions of qubits ( this-1st-paragraph,  this-2nd-paragraph,  this-3rd-paragraph ), so the current ion-trap quantum computer, which can Not have more than 30 qubits, is completely hopeless like all other quantum computers.

This research paper ↓

p.1-right-last-paragraph says "a single beryllium Be ion confined in electric trap (= only one impractical ion qubit )"
p.2-Fig.1 shows one ion qubit is very big and bulky (= more than 100μm )
p.4-left-2nd-paragraph says "an ion was transported (= about 50μm ) in 4ms"
p.6-left-lower says "temperature of 6.5 K."

↑ Despite longtime researches across the world, practical quantum computers that will need millions of qubits are still a long way off (= which means unrealized forever ), and their quantum computer researches are regressing to only one (impractical) qubit instead of progressing to larger numbers of qubits with less errors.

 

Spin qubits are deadend.

Spin-qubit silicon-type quantum computer is extremely impractical, unstable, impossible to scale up.

Silicon-type quantum computers allegedly using a tiny electron spin dot as a qubit are far more impractical than other types of qubits.  ← They only achieved highly-impractical 12 qubits, which fall far short of the practical quantum computer requiring millions of qubits ( this-lower Disagreement with Intel,  this 10th-paragraph ).

↑ This silicon-type quantum computer (= still Not a computer at all ) just measures the atomic energy state or magnetic field ( this Fig.4 ) interacting with light ( this Fig.4 ); they can Not directly measure the (fictional) electron's spin itself.

Even the latest published spin-qubit quantum computer research used only six electron qubits (= correctly, only two qubits consisting of six quantum dots,  this 4th-last paragraph,  this Fig.2 ), which is far from the practical quantum computer needing more than a million qubits ( this 2nd-last-paragraph,  this 3rd-paragraph ).

↑ And their error rate of the two-qubit CNOT gate is impractically high = 3.7% (= 3.7 × 10^-2 = 100% − 96.3% fidelity,  this-abstract, p.4-Fig.4,  this 3~5th-paragraphs ), which is also far from the practically-required error rate of 10^-15 ( this 3rd-paragraph,  this last-paragraph ).

The latest single-electron charge qubit (= each electron's motion pattern is used as the quantum bit's state 0 or 1 ) also has an impractically high error rate (= the single qubit's readout error is about 2% = 100% − 98% fidelity, this p.2-left ), and they did Not state the two-qubit error rates because stable two-qubit gate operation is impossible.

Neutral atomic qubits are hopeless.

Harvard's neutral atomic qubits and Chinese photon quantum computer's (fake) advantage are also useless pseudo-science.

Harvard's and QuEra's recent quantum computer published in Nature uses only less than 60 neutral atoms as 60 qubits with extremely-high two-qubit error rate 0.5% or 5 × 10^-3 (= 99.5% fidelity,  this 7th-paragraph,  this Fig.3 ), which are far from the practical quantum computer which will require millions of qubits with less than 10^-15 error rate ( this 7th-paragraph ).

Their quantum computer (= still Not a computer or calculator ) relies on cold neutral atoms (= ex. rubidium atoms; each atom's two energy levels are used as a qubit's 0 or 1 state ) weakly trapped in laser optical tweezers (= the success rate of trapping atoms in laser is only 50% = too unstable and erroneous !  this p.8-lower,  this p.1-right ) where each atom or qubit easily escapes (= is broken ) from the weak optical-lattice trap ( this 2nd-paragraph,  this 1.1 trapped neutral atom ), which makes it impossible to scale up the neutral atomic qubits to a practical quantum computer forever.

These neutral atomic qubits try to use the unstable Rydberg state or high-energy atomic orbital, whose weak electric dipole weakly interacts with other atomic qubits, so the qubit speed is extremely slow ( this 16th-paragraph ), their error rates tend to be impractically high (= the three-qubit error rate for making 000 or 111 GHZ states is very high = more than 10%, or 90% fidelity,  this p.10-right, Fig.4 ), and these atomic qubits cannot be connected with distant qubits ( this p.15 ), which is impractical forever.

Another startup company announced the dubious first 1000-atomic-qubit quantum computer (= still Not a computer, though ) whose alleged 1000 trapped neutral atoms (= used as qubits ) easily get broken within only 100 s ( this 11-12th-paragraphs ), and they cannot correct errors at all (= this Atom Computing company has Not published the detailed error rates, this-last-paragraph,  this-12th-13th-paragraph, because their error rate is still impractically high, this 6~7th-paragraphs.  To reduce error rates, the number of atomic qubits must be drastically reduced to less than only 60 qubits, as shown in the recent Harvard atomic quantum computer's case, whose error rate is still impractically high ).

↑ The current deadend quantum mechanical "pseudo-science" is why academic organizations focus only on "politics" and raising tuition, instead of teaching really useful "science".

This Science paper used less than 23 CaF molecules as only less than 23 atomic qubits (= each molecule's two energy levels are used as the bit's states 0 and 1 ), and their error rate of just preparing a single molecular bit's 0 state is as high as 18% (= fidelity is 82.4%,  this p.9-right-2nd-paragraph, Fig.2 ), which is far from the practical quantum computer that will need millions of qubits and an extremely lower error rate of 10^-15.

Chinese recent overhyped photon quantum computer advantage (= based on useless random boson-sampling ) is also an illusion fabricated by the wrong assumption of (unseen) quantum mechanical photon parallel-world superposition compared with a non-existent classical photon ball.

Quantum computer's overhyped news

Atomic clock has nothing to do with ( error-prone ) quantum computers.

The 2nd, 7th, 8th, 9th paragraphs of this hyped news say

"Researchers recently explored the possibility (= just uncertain possibilities ) of using quantum computing techniques to further improve the performance of atomic clocks. Their paper, published in Nature Physics, introduces a new scheme that enables the simultaneous use of multiple atomic clocks to keep time with even greater precision (= lie, this research's error rate is far worse than ordinary atomic clock, and this research has nothing do to with quantum computers )."

"The researchers used their proposed technique to control individual atoms in atomic clocks (= unlike the practical atomic clocks treating many mixed atoms simultaneously, controlling individual atoms in this research caused impractically-worse errors ). Specifically, they ensured that each atom effectively experienced the passing of time slower or faster, depending on their dynamic position in relation to the applied laser beam."

"The current most precise clocks in the world work by measuring the passage of time with a large ensemble of atoms, but we demonstrate that individual control could lead to better performance (← untrue, controlling an individual atom or an unstable atomic qubit shows a far larger error rate ) More generally, our work shows the power of combining the features of quantum computers and quantum sensors (← untrue, this research has nothing to do with quantum computer nor sensors )"

"The initial findings.. are very encouraging, highlighting the potential (= just uncertain potential, still useless ) of quantum computing techniques in metrology research. In the future (= just speculation ), this study could inspire the development of other programmable quantum optical clocks"

↑ This research just used only two independent Sr atoms (= Not a quantum computer, which will need a lot of atoms or qubits ) interacting with laser light, and measured the atomic energy levels' change with an error rate far worse than the ordinary atomic clock, so this is useless research.

In the ordinary atomic clock, the wavelength or frequency (= energy ) of the microwave is adjusted to agree with the atomic two energy levels' interval by illuminating a collection of atoms with various microwaves.

↑ The error rate of manipulating an atom in this research is far higher and worse (= fidelity = 1 − error rate = 0.984, i.e. error rate is 0.016,  this p.3-left-last = average fidelity is 0.984 in this research ) than the ordinary atomic clock error rate of only 10^-17 ~ 10^-18.
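
For a rough sense of this gap, one division is enough; both numbers are taken from the text above and treated here purely as assumed inputs (a per-operation control error is compared with a clock's fractional error only because that is the comparison made above).

# Illustrative comparison only; figures above are treated as assumed inputs.
single_atom_control_error = 1 - 0.984       # = 0.016 per operation in this research
ordinary_atomic_clock_error = 1e-17         # ordinary atomic clock error level quoted above
print(single_atom_control_error / ordinary_atomic_clock_error)   # ~1.6e15 times worse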

↑ So this research is useless not only for (error-prone) quantum computer but also for atomic clock.

Quantum computers must manipulate each atom, which causes extremely large error rate inapplicable to precise atomic clocks (= so this news claiming such error-prone quantum computers might make time more precise is a total lie ).

To use each single atom or ion for an atomic clock, they usually have to take an extremely long time = about 10 days ~ months ( this future plan,  this p.3-abstract ) at extremely low temperature to estimate the precise average atomic energy frequency or time, which is completely impractical.

Actually, this research paper's p.1-abstract-last just says
"Our results demonstrate the potential (= just potential, still unrealized ) of fully programmable quantum optical clocks (= No mention of quantum computer nor practical atomic clock ) even without entanglement (= No entanglement means this research did No two-qubit operation required for computer's calculation ) and could be combined with metrologically useful entangled states in the future (= just speculate uncertain future )."

The world's largest 1000-atomic-qubit quantum computer is meaningless and useless.

Quantum computers are still error-prone and far from practical computer that will need millions of qubits.

The current world's largest quantum computer is said to be Atom Computing company's 1180 atomic qubits or IBM's 1121 superconducting qubits, but neither of them has disclosed the detailed performance of these dubious machines ( this 1-3rd,12th-paragraphs  = D-Wave is Not a real quantum computer, so excluded ) due to their still-impractical, error-prone quantum computers.

This 7~10th paragraphs say
"For quantum computers to dethrone today's best classical computers, however, scientists would still need a quantum processor with millions of qubits (= 1000 qubits can do nothing )."

"And that's a problem, because qubits are notoriously error-prone and need to be kept at near absolute zero (= energy-inefficient, impractical ).. One in 1,000,000,000,000,000,000 (billion billion) bits in conventional computers fails, while the failure rate in quantum computers is closer to 1 in 1,000 (= quantum computer is far more error-prone than ordinary classical computer )."

↑ So the current largest quantum computer (= still Not a computer nor calculator at all ) is far from the practical quantum computer that will need millions of qubits and far less error rates.

Quantum computers are said to calculate faster by exploiting ( still-unproven, fictional ) parallel-universe (= or superposition ) simultaneous computation.

But the recent research showed ordinary classical computer could calculate much faster and more accurately than the error-prone (impractical) quantum computer ( this 6th-paragraph ) even if classical computer was unfairly forced to emulate (still-unproven) quantum mechanical superposition or parallel worlds.

The latest largest 1305-atomic qubit quantum computer is still error-prone and useless.

The latest news claims researchers at TU Darmstadt have first created the world's largest 1305-atomic-qubit quantum computer whose properties were published in the peer-reviewed journal (= Actually, their largest quantum computer is still Not a computer, because they could calculate nothing by using it in this research ).

The 2nd, 4th, 7th paragraphs of the hyped news about this alleged world's largest quantum computer say

"Quantum processors based on two-dimensional arrays of optical tweezers, which are created using focused laser beams, are one of the most promising technologies for developing quantum computing and simulation that will (= uncertain future ) enable highly beneficial applications in the future (= just speculation, still useless quantum computer )."

"the team reports on the world's first successful experiment to realize a quantum-processing architecture that contains more than 1,000 atomic qubits (= still far from millions of qubits required for practical computer ) in one single plane."

"A total of 1,305 single-atom qubits were loaded in a quantum array with 3,000 trap sites and reassembled into defect-free target structures with up to 441 qubits."  ← So actually, they created only 441 atomic qubits (= Not 1000 qubits ) that were placed in the "designated" positions (= even arranging atoms or qubits in desired positions is still impossible, so useless ).

↑ In the atomic-qubit quantum computer, they try to trap multiple cold neutral atoms (= each atomic two energy levels are used as a bit's 0 or 1 states ) in laser light or optical lattice tweezer.

↑ The most serious problem which makes atomic-qubit quantum computer impractical is that those atoms easily disappear and drop out of (unstable) laser optical lattice trap ( this technical challenge ).

This research paper (= this p.3-left-results, p.4-left-1st-paragraph ) says
"an average loading efficiency of approximately 40% (= only 40% of atoms are trapped or loaded in laser light as qubits, and 60% atoms are lost )"

"With supercharging, the cumulative success probability increases to a value of 35% (= success rate of putting atoms onto right places is only 35%, and 65% error rate !  this-upper-figure-(a)-red line ), still not exhibiting saturation at the maximum implemented number of 50 assembly cycles"

↑ This research on the alleged world's largest atomic-qubit quantum computer (= still impractical and error-prone ) did Not perform any calculations nor qubit operations.
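
As a quick consistency check on the figures quoted above (1,305 atoms loaded into 3,000 trap sites, ~40% loading efficiency, 441 target sites), the minimal sketch below treats those numbers as assumed inputs and, purely for illustration, assumes trap sites load independently; it also shows why a defect-free array is essentially never obtained in a single loading attempt, which is why reassembly cycles are needed.

# Illustrative consistency check only; figures are taken from the text above.
loaded_atoms, trap_sites = 1305, 3000
print(loaded_atoms / trap_sites)     # ~0.435, consistent with the ~40% loading efficiency

# Probability that all 441 target sites are filled in one random loading attempt,
# assuming (for illustration) independent sites with 40% loading probability each:
p_site, target_sites = 0.40, 441
print(p_site ** target_sites)        # ~3e-176, i.e. essentially zero without reassembly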

According to the recent unpublished Atom Computing paper, they had to continuously load new atoms onto the optical lattice at time intervals of 300 ms (= the frequently-lost atomic qubits cause errors and cannot be used as practical memories ), because atoms or qubits constantly disappear with a lifetime of less than only 30 seconds ( this p.6-right-Appendix B,  this p.9-2nd-paragraph ).

↑ In spite of this frequent loading of new atoms, about 1% of the only 1225 atomic qubits still remained lost (= far from the practical millions of qubits, this 1st, 4th-paragraphs.  ← Atom Computing's paper does not show the success probability or error rates of loading atoms ).

↑ Atom Computing's quantum computer also did Not calculate anything nor operate qubits due to focusing all of their time and energy only on atomic qubit loss ( this-abstract ).

As a result, quantum computers are already dead and hopeless with No progress, contrary to an incredible amount of overhyped news.

Hyped topological quantum computer with fictional anyon quasiparticle  in 2024.

Research on topological quantum computer with fictional non-Abelian anyon quasiparticle, which was allegedly less-error-prone (= wrong, actually very error-prone ), appearing in arxiv preprint a year ago was published in Nature recently (= 2/14/2024 ).

First of all, this topological quantum computer is just fiction and scientifically meaningless except for publishing papers in academic journals.

In topological quantum computer, fictional quasiparticles such as anyon (= Not real particle, this 3rd-paragraph ) and Majorana with fractional charge are said to entangle with each other by using fictional unseen braids (= Not physical string, this p.3-middle-2nd-paragraph ), and these nonphysical tied braids of anyon quasiparticles are said to make topological quantum computer robust or less error-prone (= only inside unproven imaginary mathematical theory ).

There is No evidence that such (fictional) anyon quasiparticles or magical braids (= Not real particle or braid, but just nonphysical math thing, this 10th-paragraph ) may exist.

This recent alleged non-Abelian topological quantum computer published in Nature just used 27 trapped ion qubits (= still far from millions of qubits required for practical quantum computer ), which are Not the authentic (fictional) anyon quasiparticles, and they are still error-prone (= due to No real magical braids ) and useless.

This 9th-paragraph says
"However, some researchers claim that Quantinuum has Not actually created non-Abelian anyons. They argue that the firm has instead merely simulated them ( this 2nd-last-paragraph )."  ← They tried to pretend that 27 ions ( this p.2-left-2nd-paragraph, p.4 ) were fictional anyon quasiparticles.

This last-paragraph says
"All in all, there are still questions on whether the "credible path to fault tolerant quantum computing" was unlocked ( this 10-13th-paragraphs )"  ← still No evidence of less errors

This 27-ion-qubit quantum computer (= still Not a computer ), pretending to be (fictional) braided anyon quasiparticles ( this p.27-30 ), still has a very bad and high error rate of 35% (= 1.6% error rate per single qubit ) or fidelity (= 1 − error rate ) of 0.65 (= 0.984 per qubit, this p.12-A14,A15 ), which is far worse and higher than the error rate of 10^-15 that is needed for a useful quantum computer.
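
The whole-register figure quoted above follows directly from the per-qubit figure, as this minimal check shows; errors on the 27 qubits are assumed independent and uncorrelated purely for illustration.

# Minimal arithmetic check of the figures quoted above (assuming independent errors).
per_qubit_fidelity = 0.984            # = 1 - 0.016 per-qubit error rate quoted above
n_qubits = 27
register_fidelity = per_qubit_fidelity ** n_qubits
print(register_fidelity)              # ~0.65, i.e. ~35% total error, as quoted above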

Quantum computer is deadend, regressing to only one useless qubit, which is far from the practical millions of qubits.

The 3rd, 4th, 13th, 14th paragraphs of this hyped news say

"In a paper published in Nature Communications, the engineers describe how they used the 16 quantum 'states' of an antimony (= Sb ) atom to encode quantum information (= actually only two energy levels out of 16 splitting energy states of one Sb atom could be used as only one qubit's 0 and 1 in this research, this p.7-Gate fidelities,  which is still Not a computer )."

"Antimony is a heavy atom that can be implanted in a silicon chip, replacing one of the existing silicon atoms. It was chosen because its nucleus possesses eight distinct quantum states, plus an electron with two quantum states, resulting in a total of 8 x 2 = 16 quantum states, all within just one atom (= only one qubit ). Reaching the same number of states using simple quantum bits—or qubits (= this experiment used only one Sb atom or one qubit, which is Not a quantum computer that will need millions of qubits )"

"The quantum computers of the future will (= just speculation, still useless ) have millions, if not billions of qubits (= but this experiment used just one impractical atomic qubit ) working simultaneously to crunch numbers and simulate models in minutes that would (= just baseless speculation ) take today's supercomputers hundreds or even thousands of years to complete (= which overhyped description contradicts the latest result proving classical computer outperforming quantum computer )"

"While some teams around the world have made progress with large numbers of qubits, such as Google's 70 qubit model or IBM's version which has more than 1000, they require much larger spaces for their qubits to work without interfering with one another"

↑ The current leading superconducting qubit is inconveniently big (= one bit is millimeter ), bulky and energy-inefficient (= they have to be cooled to almost absolute zero ), which cannot be scaled up to millions of qubits required for the practical quantum computer ( this 3~4th-paragraphs ).

↑ This research tried to use "only one single Sb atom" as one qubit (= using two energy levels of a Sb atom as a bit's states 0 and 1,  this p.7-left-gate-fidelities-only one-qubit gate ) with still high-error rate.  ← Using only one bit or qubit (= even two qubit operation has Not been done in this research ) is Not a computer at all.

Still only-one qubit, high error rate, Not a computer at all.

The error rate of one qubit operation in this research is still impractically high = one qubit's error rate is 1 ~ 10% ( this p.17-Table S2,S3 ), which is far higher and worse than the practically-required error rate of less than 10^-15.

To explain the observed multiple energy levels of a Sb atom under electromagnetic field, they just fitted free parameters to experiments ( this p.11-S11, p.12-S15 ) with No quantum mechanical prediction nor calculation.

The practical quantum computer is said to need more than millions of qubits ( this 7~8th-paragraphs ).
But even this latest research in 2024 could make only one unstable (atomic) qubit with impractically-high error rate.

So quantum computer research has been already deadend with No progress, rather, they are regressing to only one single useless qubit (= Sadly, publishing papers in journals instead of aiming to invent useful machines is the current research's only purpose ).

 

Quantum entanglement, teleportation are meaningless useless concepts sending No real information or doing No work.

Quantum network and internet use just very weak classical light as (fictitious) fragile, error-prone photons, whose fragility makes the quantum network or internet impractical forever.  ← All quantum mechanical concepts are illusion and useless.

(Fig.C)  Quantum cryptography uses very weak classical light as fictitious photon's information which is easily lost and impractical forever.

The so-called faster-than-light entanglement and teleportation are junk, unnecessary scientific concepts that can Not send any real information, much less faster than light ( this last-paragraph,  this 1-2nd paragraphs ).

Quantum information, internet, quantum network and cryptography are also fruitless, meaningless research which can never be of practical use ( this p.1-introduction,  this p.2-left-last-paragraph,  this p.1-last-paragraph,  this p.1-left-1st-paragraph ) .

The illusory quantum entanglement and teleportation are just the act of measurement (= called Bell-state measurement or BSM ) of the polarization of photons (= weak polarized classical lights ) at light detectors combined with ordinary classical beam splitters (= the useless quantum entanglement or teleportation is Not an act of sending real information in a quantum mechanical way ), so physicists must send this light-polarization information through an ordinary classical (= Not quantum mechanical ) communication channel ( this lower ).  ← No quantum mechanics is involved.  And this meaningless Bell-state measurement can only distinguish the light's polarization (= utilized as quantum information ) with an impractically low efficiency = only 57.9% even in the latest research ( this 8th-paragraph,  this p.11 ).

↑ It means the (meaningless) quantum teleportation (= whose meaning is the same as entanglement ) can neither do real work nor send any real information, it needs conventional classical communication channel to send information or communicate ( this-lower-Is quantum teleportation faster than..,  this Please note that ).
Quantum entanglement can be explained or replaced by weak classical light wave.  ← Unphysical photon particle is unnecessary.

The latest teleportation research (= this-1st-paragraph ) also did Not send any information nor images (= except for classically sending lights ). All they did was just (meaningless) "measurement (= BSM )" of weak lights or photons ( this p.2-Fig.1,  this Figure.1 ).

↑ Neither quantum entanglement nor teleportation can send real information, so the quantum network or internet just tries to send information encoded in very fragile photons (= polarization ) or very weak classical light waves, which is irrelevant to quantum mechanical (fictional) power, in an ordinary classical way (= Not faster-than-light ), and those fragile photons or weak lights, which are easily lost and useless, make it impossible to realize the (media-hyped) quantum internet or network forever ( this 2nd-last-paragraph,  this 2nd-paragraph,  this The Challenge ahead,  this p.1-introduction-first ).

↑ Quantum mechanics illogically tries to use this inconvenient easily-lost photon as the reason for "(illusory) secure" quantum cryptography or network (= which can be explained by classical weak light wave, quantum photon is unnecessary ), which is why such fragile-photon's quantum internet or network is impractical forever.

The inconvenient quantum network is Not allowed to amplify or copy the attenuated, easily-lost photons (= just weak classical light wave used as a photon qubit ).

So the detection efficiency of their weak fragile photons is impractically low even in the latest (overhyped) quantum network research (= this single-photon detection efficiency is extremely low = only 0.035 or 3.5% = about 97% of photons are easily lost !   the two-photon coincidence detection rate is miserably far lower = only 8.5 × 10^-6, this p.9-I.~p.10 ) after interacting with atoms (= which correspond to the still-impractical quantum repeater or memory,  this 10th-paragraph ) and traveling only 36 km, which distance is far from a practical internet ( this 3rd~ paragraphs,  this 13th~ paragraphs,  this 7-10th paragraphs,  this 2nd-last paragraph ).

The alleged secure quantum internet or cryptography's mechanism of detecting eavesdroppers who destroy or alter the (fictitious quantum) fragile information (= such as weak light or photon polarization,  this 2nd-paragraph,  this 14-15th-paragraphs ), which is impractical ( this p.6-conclusion ) and unable to communicate while eavesdropping is happening ( this 11-14th-paragraphs ), can be naturally explained by classical mechanics (= because the act of measurement always alters the information, which can be detected, whether classical or quantum ).

↑ In this (useless) quantum cryptography, the sender and receiver are required to randomly choose different "methods (= called "bases", such as the light polarizer's angle, which affect the measurement results )" of measuring information.  ← Which measurement basis is chosen is unknown to the eavesdropper beforehand; hence, they claim the eavesdropper (who may choose the wrong basis and obtain wrong information ) cannot copy the correct information obtained by receivers who may choose different measuring bases.  ← This cryptography mechanism based on the choice of different measurement methods (= bases ) can be easily realized also in classical devices (= by inserting two different pieces of information, which can be extracted by two different basis measurements, into one message, and arranging for measurement of one piece of information to destroy the other ), so No quantum mechanics is needed.
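
The basis-mismatch logic described above can be sketched in a few lines; the simulation below is only an abstract illustration (idealized bits, random basis choices, a wrong-basis measurement returning a random outcome), not an implementation of any particular cryptographic system, and all names and numbers in it are assumptions. It stays agnostic about whether the carrier is modeled as a quantum photon or, as argued here, weak classical light.

# Illustrative sketch of the basis-mismatch idea described above (assumed, idealized model).
# A bit read out in the wrong basis yields a random, uncorrelated result, so an
# eavesdropper who has to guess bases introduces detectable disagreements.
import random

def measure(bit, prepared_basis, measured_basis):
    # Idealized rule: right basis -> faithful outcome, wrong basis -> random outcome.
    return bit if prepared_basis == measured_basis else random.randint(0, 1)

def disagreement_rate(n_rounds=100_000, eavesdropper=True):
    errors = kept = 0
    for _ in range(n_rounds):
        bit = random.randint(0, 1)
        basis_a = random.choice("XZ")            # sender's random basis choice
        sent_bit, sent_basis = bit, basis_a
        if eavesdropper:                         # intercept-resend with a guessed basis
            basis_e = random.choice("XZ")
            sent_bit = measure(sent_bit, sent_basis, basis_e)
            sent_basis = basis_e
        basis_b = random.choice("XZ")            # receiver's independent basis choice
        received = measure(sent_bit, sent_basis, basis_b)
        if basis_a == basis_b:                   # keep only rounds with matching bases
            kept += 1
            errors += received != bit
    return errors / kept

print(disagreement_rate(eavesdropper=False))     # ~0.0  (no interception)
print(disagreement_rate(eavesdropper=True))      # ~0.25 (interception is detectable)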

In the recent hyped research, they used the impractical error-prone IBM quantum computer with only four superconducting qubits (= only a 4-bit string = Not a computer at all ) called "Trent" as a generator of only 4-bit-string information, and a sender (= Alice ) and a receiver (= Bob ) measured only 2 qubits of those 4 qubits as information using different measurement ways (= bases ) without sending qubit information over a long distance.  ← The use of only 4-bit information communicated only inside the tiny impractical IBM quantum computer can Not be called a practical secure network ( this p.3-third-paragraph, Fig.3-bc, p.7-methods ).

Quantum information, internet hype

Quantum key distribution is impractical

The 3rd paragraph of this hyped news says
"She investigated how the visibility of the so-called Hong-Ou-Mandel effect, a quantum interference effect (= wrong. This can be explained by classical light interference + a photodetector that just detects electrons ejected by classical light, Not fictitious photon particles ), is affected by multiphoton contamination."

And the 2nd-last and last paragraphs of the same news say
"The findings could (= just speculation ) be important for quantum key distribution, which is necessary for secure communications in the future (= uncertain future )
However, many questions remain unanswered, Little research has been done into multiphoton effects, so a lot of work is still needed (= still useless quantum key distribution )"

Quantum memory is impractical forever.

Physicists try to develop "quantum memory (= usually some atomic energy levels are used for storing light energy or information )" that can allegedly store information of light or photon transiently.

But even in the latest research in 2023, the storing efficiency of this quantum memory is only 60% (= 40% light or photon information is lost, each time they try to store information in this impractical quantum memory ), which is deadend, unable to make useful quantum internet.

But the 2nd and last paragraphs of this hyped news say
"So far, developing these high-dimensional memories has proved challenging, and most attempts have not yielded satisfactory efficiencies"

"In the future, we will (= uncertain future, still useless ) establish high-dimensional quantum repeaters using high-dimensional quantum memories, enabling high-dimensional quantum communication between two or more remote quantum nodes (← ? )"

Quantum internet and network are a joke, impractical forever, contrary to the media-hype.

The 10th, 13th, 14th paragraphs of this recent hyped news say

"A spinning electron (= wrong, an electron's spin is Not real spinning ), or electronic qubit, is very good at interacting with the environment (= wrong, there is still No practical qubit ), while the spinning nucleus of an atom, or nuclear qubit, is not. We've combined a qubit that is well known for interacting easily with light with a qubit that is well known for being very isolated, and thus preserving information for a long time (= this is wrong, too )"

"In general, then, light carries information through an optical fiber to the new device, which includes a stack of several tiny diamond waveguides that are each about 1,000 time smaller than a human hair. Several devices, then, could act as the nodes that control the flow of information in the quantum internet (= wrong, still far from internet )."

"The work described in Nature Photonics involves experiments with (only) One (tiny) device (= Not internet nor network connecting multiple devices at all ). Eventually, however, there could (= just speculation, still useless ) be hundreds or thousands of these on a microchip (= just imagination )"

↑ This research tried to use two atomic energy states or levels (= Not spin itself ) of a tin atom inside a tiny diamond as (quantum) bits 0 and 1 (= the fictitious electron spin itself is unseen; they just measured two energy states interacting with lights, and No quantum mechanical prediction nor calculation using spin was done in this research, just experimental energy values were used, this p.3-left ).

Realization of practical quantum internet or network is impossible forever due to the severe loss of light or photon whose polarization or phase is used as bit information.

Even this latest research (= p.1-left-last~right ) admits
"However, spectral instability and low-rate coherent-photon emission remains an obstacle for scaling up quantum networks (= still impractical )."

First of all, the tiny device allegedly used as qubits in this research must be cooled to extremely low temperature = only 0.4 K (= almost absolute zero ! ), which is impractical ( this p.7-left-2nd-paragraph ).

They reported about 98% of the light or photons entering this tiny device was lost, which means the light or photon detection efficiency (after interacting with this tiny device ) even in this latest research was only 1.7% (= more than 98% error rate ), as shown in this p.5-6-Table S1.

Probability of detecting two photons (= two weak lights ) is miserably low = only 5.0 × 10^-5 = 0.00005 (= due to severe photon loss ) as shown in this p.4-left-2nd-paragraph and this Fig.3c.

As a result, quantum internet, quantum network and quantum computer trying to use fragile photon or very weak light as information messenger in vain are impractical forever, because photon detection efficiency or severe photon loss have Not been improved at all despite extremely long-time researches across the world.

Quantum internet is still a joke, impractical forever.

The 2nd, 4th, 7th paragraphs of this hyped news say

"A team.. have taken a significant step toward (= meaning "still unrealized" ) the building of a quantum internet testbed by demonstrating a foundational quantum network measurement that employs room-temperature quantum memories. "

"While the vision of a quantum internet system is growing and the field has seen a surge in interest from researchers and the public at large, accompanied by a steep increase in the capital invested, an actual quantum internet prototype has Not been built (= even the prototype of quantum internet has Not been built )"

"They tested how identical these memories (= this quantum memory just means Rb atomic two energy levels' transition storing the incident weak light or photon energy, this p.6 quantum memories, p.2-results ) are in their functionality by sending identical quantum states (= just using weak classical polarized light or photon as quantum information carrier ) into each of the memories and performing a process called Hong-Ou-Mandel (= HOM ) Interference (= which is just interference of two classical lights after passing beam splitters ) on the outputs from the memories, a standard test to quantify the indistinguishability of photon properties."

↑ This research just sent two very weak (classical) light called (fictitious) photons through rubidium (= Rb ) atomic vapor (= each Rb atomic energy levels can store the incident light's energy transiently = only microsecond ) called quantum memory, and got two retrieved lights (= significant photon loss, so useless quantum memory ) interfering with each other at beam splitters.

This p.3-right-4th-paragraph says
"a mean photon number per pulse of ~1.6 serve as the inputs to the memories in a single-rail configuration. After storage and retrieval from each memory, we measure the mean signal photon number to be ~0.01 (= photon number drastically decreases from 1.6 to only 0.01 after stored in the impractical quantum memory ) at the inputs of the beam splitter"

According to this p.3-Table 1, the input photons (= which are just weak classical light whose intensity exceeds the photodetector's threshold ) suffered significant loss while traveling and getting through the quantum memory, and the photon number drastically decreased from 1.6 to only 0.00065 (= the photon detection efficiency is only 0.00065/1.6 = 0.0004, meaning 99.96% of photons were lost ), so this quantum memory or internet is completely impractical.
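
Using the photon numbers quoted above as assumed inputs, the end-to-end efficiency of this quantum memory follows from one division, as the minimal sketch below shows; the variable names are illustrative only.

# Illustrative arithmetic only; photon numbers are taken from the text above.
mean_photons_in = 1.6           # mean photon number per pulse at the memory input
mean_photons_out = 0.00065      # mean photon number surviving storage, retrieval and travel
efficiency = mean_photons_out / mean_photons_in
print(efficiency)               # ~0.0004, i.e. ~99.96% of the light is lost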

This paper p.6-left-3rd-paragraph also says
"several hurdles still exist toward using warm atomic vapor QMs (= quantum memories ) in memory-assisted measurements and operations. As evidenced in this study, mitigating the noise associated with these memories stands as a paramount challenge to achieve indistinguishable retrieved qubits."

As a result, the quantum internet or network is already deadend and impractical (forever) despite an incredible amount of time and money wasted for this (fruitless) research across the world.
Only unfounded hypes misleading laypersons remain.

 


2023/11/21 updated. Feel free to link to this site.