Quantum computers are hopeless and useless, filled with only hype and lies.


Despite the longtime media hype, still Nobody (except for research and hype purposes) uses quantum computers, which have proved to be useless, a dead end forever.


A quantum computer with a small number of error-prone qubits is still Not a computer. No supremacy.

The quantum computer is already dead, impractical forever. Only hype remains.

(Fig.Q) Quantum computers that can Not give right answers due to their small numbers of qubits and large numbers of errors are far inferior to ordinary practical classical computers.

A practical quantum computer is impossible to realize, forever.  Only overhyped news is rampant.

Contrary to overhyped news, quantum computers are useless forever.

Quantum computer is still Not a computer or calculator at all.

To hide this inconvenient fact of the already-deadend quantum computer, an incredible amount of overhyped fake news is spread every day.

Quantum computer's speed-up mechanism based on parallel universes is unrealistic.

Each quantum bit or qubit is said to take the 0 and 1 states simultaneously using quantum superposition or parallel worlds, for which there is No evidence.

(Fig.S) One qubit can be observed to be only one state, 0 or 1, after all.  Quantum superposition of one qubit being in 0-and-1 parallel-world states is an illusion.

The ordinary (classical) computer consists of many bits (= transistors ), and each bit can take 0 or 1 state.

Quantum computer uses two unstable energy levels of each atom, ion or artificial atom (= superconducting circuit ) as a quantum bit or qubit (= each atomic qubit's ground state = 0, and excited state = 1 ).

This qubit is said to take the 0 and 1 states simultaneously using quantum superposition, a dead-and-alive cat or (fictional) parallel worlds; this (unrealistic) mechanism is the basis of the claimed quantum computer speed-up or supremacy ( this 4~7th-paragraphs,  this 10~11th-paragraphs ).

But physicists can observe only one single state 0 or 1 in one single world in quantum computer bits or qubits ( this 4~5th-paragraphs,  this 26~29th-paragraphs ).

So there is No evidence that a qubit can take 0 and 1 states using the magical (unrealistic) quantum superposition or parallel universes ( this p.1 ), which means No evidence of quantum computer's speed-up nor practical application.

Practical quantum computer is said to need millions of errorless qubits, which is impossible.

The present quantum computer has only a small number of qubits (= fewer than 1000 ), far from the millions of qubits required for a practical computer.

(Fig.M)  Quantum supremacy is said to need millions of qubits, which is impossible to achieve forever.

Today's (impractical) quantum computers have only 5 ~ 1000 qubits, which fall far short of the millions of qubits required for true quantum advantage.

The quantum computer is said to use a quantum bit or qubit, which typically consists of an atom or ion whose two (unstable) energy levels (= ground state = 0,  excited state = 1 ) are used as each bit's state.

The ordinary practical (classical) computers such as laptops have billions of bits (= each bit can take 0 or 1 ) or transistors ( this 13~14th-paragraphs ).

But the present quantum computers have only very small numbers of bits or qubits (= quantum bits ).  ← Most quantum computers have only 1 ~ 50 qubits (= each qubit can take only 0 or 1, so 1 ~ 50 qubits amount to just an impractical 1 ~ 50-bit bitstring ), which are completely useless, Not a computer at all ( this 1st-paragraph,  this 6th-paragraph ).

The practical quantum computer is said to need more than millions of qubits ( this summary,  this 2nd-paragraph,  this 4th-paragraph,  this 11th-paragraph ).

Even the current largest quantum computer of IBM has only 1000 error-prone qubits, which are far from millions of (errorless) qubits required for practical computers.

This overhyped news (= 4~5th, 9th paragraphs ) is about only two impractical qubits far less than the hyped millions of qubits.

All the present quantum computers are impractical, too error-prone to give right answers.  So quantum supremacy is fake.

Today's fake quantum supremacy claims come from wrong, unfair comparisons with classical computers; quantum computers, still with too many errors, remain useless with No real speed-up.

(Fig.E)  Today's quantum computers are too error-prone to achieve quantum supremacy or advantage.

Today's quantum computers are useless, with far higher error rates (= 10⁻³ ) than the practical computer's error rate of 10⁻¹⁷.

The biggest problem that makes it impossible to put quantum computers to practical use is a high error rate or noise.

The ordinary classical computer or laptop has an extremely low error rate of only 10⁻¹⁷ ~ 10⁻²⁷, which means only one error happens in 10¹⁷ bit operations, and these errors can be easily corrected.

So the current (classical) computers used in our daily life have No errors.

On the other hand, all the current quantum computers have error rates as high as 10⁻³ (= 0.1% ) ~ 10⁻² (= 1% ) per qubit operation, which means one error occurs in every 100 ~ 1000 operations ( this 2nd-paragraph ); they are far more error-prone than practical classical computers ( this p.1-left-last-paragraph,  this 2nd-paragraph ).
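To make the gap concrete, here is a minimal sketch (my own arithmetic, using only the rates quoted above) of how many errors each kind of machine is expected to make over a million operations:

```python
# Minimal sketch: expected error counts at the rates quoted above
# (quantum ~1e-3 per gate vs. classical ~1e-17 per bit operation).
quantum_error_rate = 1e-3     # per qubit gate operation (quoted above)
classical_error_rate = 1e-17  # per classical bit operation (quoted above)

operations = 1_000_000        # a modest million-operation computation

print(f"Expected quantum errors:   {quantum_error_rate * operations:.0f}")    # ~1000
print(f"Expected classical errors: {classical_error_rate * operations:.0e}")  # ~1e-11

# Probability the whole quantum run finishes with no error at all:
p_clean = (1 - quantum_error_rate) ** operations
print(f"P(error-free quantum run): {p_clean:.3e}")  # effectively zero
```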

Furthermore, the error correction of quantum computers is impossible (forever).

So all the current quantum computers are unable to give right answers when they try to calculate some values, which means quantum supremacy or advantage is fake.

Today's fake quantum supremacy is based on outputting random meaningless numbers, into which a lot of the quantum computer's errors can be hidden (= ordinary classical computers can output random numbers more quickly, so No real quantum supremacy ).

Quantum annealing, as seen in scheduling and logistics, is a fake quantum computer whose (fake) quantum advantage is due to unfair comparison with bad classical methods, deliberately avoiding the best, faster classical methods.

The practical quantum computer is said to need a much lower error rate of only 10⁻¹⁵ ( this 5th-paragraph,  this 3rd-paragraph,  this-abstract ), which is impossible to achieve.

 

Annealing machines such as D-Wave, allegedly used for logistics or optimization problems, are fake quantum computers with No quantum speed-up nor utility.

The fake quantum advantage of quantum annealing in optimization problems (= finding the lowest energy or faster scheduling ) is caused by deliberately choosing bad (= not the best ) classical methods, so No quantum speed-up. Only hypes and lies remain in (useless) quantum annealing machines.

(Fig.D)  D-Wave quantum annealing machines are slower and worse than classical computers with good algorithms (= such as classical Selby ) in optimization problems, so No quantum computer's advantage.  Only quantum hypes remain.

Quantum annealing for scheduling and traffic is a fake quantum computer with No advantage.

The D-Wave annealing machine for optimization problems is Not a quantum computer, and Not faster.

Contrary to many hypes, the quantum annealing machines of D-Wave and other companies for optimization problems (= finding the lowest energy as the final solution,   this p.4 ), applicable to traffic, logistics and decryption, are Not real quantum computers.

So D-Wave and other quantum simulators for optimization problems are useless, unable to show quantum speed-up or advantage ( this 8th-paragraph,  this p.2-right-1st-paragraph ), as shown in the hopeless business of D-Wave.

See also this p.1-last,  this p.1-left-introduction-1st-paragraph,  this 6th-paragraph.

This p.2-2nd-paragraph-middle~lower says
"While the question of whether QA (= quantum annealing ) can provide a quantum speed-up over classical approaches is still subject of a scientific debate.. to the best of our knowledge, there is No industrial application where QA unquestionably outperforms classical heuristic algorithms."

D-Wave annealing machines use classical circuits controlled by magnetic field.  ← No quantum mechanics.

D-Wave annealing machines use (classical) superconducting circuits called flux qubit ( this p.6-7 ) whose electric current is controlled by external magnetic field ( this middle-flux qubit ).

In this flux qubit (= quantum bit ), the direction of the electric current ( clockwise = 0,  anti-clockwise = 1 ) represents each qubit's 0 or 1 state.

↑ No quantum mechanics is used.
Just the quantized de Broglie wave influences flux qubits.

Quantum annealing is useless for the traveling salesman problem, traffic, and logistics.  ← No quantum advantage

D-Wave annealing machines showed No quantum speed-up or advantage in optimization problems such as the traveling salesman problem, logistics, traffic, and scheduling.

It is said that D-Wave annealing machines are good at optimization problems finding the lowest energy state or the shortest route as solutions.

This optimization problem includes finding the shortest route or way in traveling salesman problem, logistics, traffic flow and scheduling.

But in fact, D-Wave annealing machines were useless, showing No quantum advantage or speed-up in this optimization, traveling salesman problem, or logistics ( this 2nd-last-paragraph,  this p.1-abstract,  this p.1-intro-2nd-paragraph ).

This 5th-last-paragraph says "It is still not clear how to prove quantum advantage in this kind of quantum computation. In fact, it hasn't been proved yet that quantum annealing gives an advantage over classical optimization algorithms."

This abstract says
"The traveling salesman problem is a well-known NP-hard problem in combinatorial optimization...
It is found the quantum annealer can only handle a problem size of 8 or less nodes and its performance is subpar compared to the classical solver both in terms of time and accuracy."
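For scale, an 8-node instance (the largest size the quoted paper says the annealer could handle) is trivially solved exactly by classical brute force. A minimal sketch, using a hypothetical random distance matrix:

```python
# Minimal sketch: exact brute-force solution of an 8-node traveling
# salesman instance -- the largest size the quoted paper says the
# quantum annealer could handle. The distances here are hypothetical.
import itertools
import random

random.seed(0)
n = 8
dist = [[0 if i == j else random.randint(1, 100) for j in range(n)] for i in range(n)]

best_len, best_tour = float("inf"), None
# Fix city 0 as the start; enumerate all (n-1)! = 5040 orderings of the rest.
for perm in itertools.permutations(range(1, n)):
    tour = (0,) + perm
    length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
    if length < best_len:
        best_len, best_tour = length, tour

print(best_tour, best_len)  # exact optimum in milliseconds on a laptop
```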

Fake quantum speed-up in D-Wave comes from deliberately choosing bad classical methods (= simulated annealing )

There are various good classical methods (= ex. Selby ) which are faster and more accurate than D-Wave quantum annealing.  So there is No quantum advantage.

Fake quantum speed-up or advantage of D-Wave annealing machines is based on unfair comparison by deliberately choosing bad classical methods such as simulated annealing and Monte Carlo methods.

There are better classical methods such as Selby algorithm that can solve optimization problems much faster and more accurately than D-Wave quantum annealing machines ( this-3rd-paragraph,  this-2nd-paragraph,  this p.15 ).

This 15, 19th-paragraphs say

"So besides simulated annealing, there are two more classical algorithms that are actors in this story. One of them is quantum Monte Carlo, which is actually a classical optimization method,"

"What the Google paper finds is that Selby's algorithm, which runs on a classical computer, totally outperforms the D-Wave machine on all the instances they tested."

This p.6-left-C. says
"Based on the results presented here, one cannot claim a quantum speedup for D-Wave 2X,.. This is because a variety of heuristic classical algorithms can solve most instances of Chimera structured problems (= D-Wave quantum annealing ) much faster than SA, QMC, and the D-Wave 2X.. For instance, the Hamze-de Freitas-Selby algorithm "

Classical Selby algorithm can give right answers more quickly than D-Wave quantum annealing.

This research paper says ↓

p.11-1st-paragraph says
"D-Wave fails to compute a ground state for all 20 instances (= D-Wave could Not obtain the lowest-energy ground state solution )"

p.15-7.9-3rd-paragraph says "Selby heuristic (= classical ) did not find the best known solution within the standard allowed time of 30 seconds, but when we allowed 100 seconds instead, the heuristic found the best known solution for all instances but one"

p.15-lower says "#Selby-best-100 number of instances for which the (classical) Selby heuristic found the best known solution within 100 seconds."
"#DW-best number of instances for which D-Wave found the best known solution"

p.16-Table 2 shows the classical Selby (=#Selby-best-100) obtained almost all the best solutions, while D-Wave (= #DW-best ) was unable to find the solutions in many instances.

↑ As a result, D-Wave quantum annealing is unable to obtain the right solution (= lowest-energy ground state ) in many cases, which is far inferior to the good classical Selby algorithm.

D-Wave and Harvard's optimization problems deliberately chose bad classical method to claim fake quantum advantage or supremacy.

Also in the recent optimization problems of the (useless) Ising model, D-Wave and Harvard researchers deliberately picked bad classical methods such as simulated annealing and Monte Carlo, instead of the better classical Selby method, to (falsely) claim quantum annealing (= QA ) advantage ( this p.5-left-Benchmarking against simulated annealing,  this p.1-right-1st-paragraph ).

The recent D-Wave fake quantum supremacy is also due to unfair comparison with deliberately-chosen bad classical methods.  ← This D-Wave supremacy paper's p.47-last-paragraph used only slow classical methods such as various Monte Carlo (= MC ), simulated annealing (= SA ), and simulated quantum annealing, and did Not include the faster, best classical Selby algorithm.

↑ D-Wave's fake, useless quantum annealing supremacy is reflected in the company's poor profitability, even after the latest research.

Other group's latest research showed D-Wave (= DW ) quantum annealing machine was worse than other classical computer's methods in some optimization problems ( this p.2-right-1st-paragraph, p.9-right-2nd-paragraph ).

So unlike the media-hype, there has been No quantum speed-up or advantage.

D-Wave quantum annealing (= QA, DW ) is slower and worse than even the bad classical method of simulated annealing (= SA ) in many cases.

This-abstract-lower,  this p.11-2~3rd-paragraphs

 

Quantum computer's supremacy is fake, useless.

Quantum computers' supremacy and advantage are an illusion, just outputting random meaningless numbers (= with high error rates ), Not faster, and far inferior to classical computers.

(Fig.S)  The present error-prone quantum computers are unable to give right answers, so physicists tried to hide their errors into random meaningless numbers as fake supremacy.

Quantum supremacy or advantage turned out to be fake; ordinary classical computers are far superior and faster.

In fact, all the claims of quantum computer supremacy and advantage are fake and useless, Not faster at all ( this-3rd-last-paragraph,  this classical computer triumphs over quantum advantage ).

After Google team announced the so-called quantum supremacy or speed-up using a 53-qubit quantum computer, various teams showed that ordinary classical computers were actually faster than the quantum computer.

The Chinese team's and Canadian Xanadu's photon quantum computer advantage was also fake and has been refuted.  Classical computers were proved to be faster than quantum computers ( this 2nd-paragraph ).

Even the current largest 127-qubit IBM quantum computer was outperformed by ordinary classical computers, so it was impossible that Google's or other teams' quantum computers with far fewer qubits could outperform classical computers.

This 2nd-paragraph says
"IBM readily admits that nobody has yet demonstrated quantum advantage"  ← No quantum computer's advantage.

There is No quantum supremacy nor advantage from the beginning, contrary to hypes.

(Fig.N)  Today's (fake) quantum supremacy claims are based on unfounded assumption of (fictional) quantum parallel worlds involved in giving random (meaningless) numbers

It is impossible to prove quantum supremacy based on the unfounded, fantasy superposition or parallel worlds only from random meaningless numbers including a lot of errors.

The present error-prone quantum computers can Not give right answers nor correct errors.

So there is No quantum supremacy nor advantage over the ordinary practical classical computer that can always give right answers with No errors.

Today's (fake) quantum supremacy or advantage experiments just try to hide a lot of errors produced by quantum computers into meaningless random numbers (= because random numbers remain random, even if they contain a lot of errors,  this 6~10th-paragraphs,  this 2nd-paragraph ).

Real quantum supremacy (= by correcting errors ) is said to need more than millions of qubits, which is impossible with the present fewer-than-100 qubits (= each qubit can take only the 0 or 1 state ); such a machine is still Not a quantum computer nor calculator ( this 4th-paragraph,  this 8~10th-paragraphs ).

Today's (fake) quantum supremacy or advantage claims are based on the unfounded assumption that just 53 ~ 67 qubits gave random (meaningless) numbers by utilizing (unseen, fictional) quantum superposition or 2⁵³ ~ 2⁶⁷ parallel universes that cannot be emulated by the ordinary classical computer with a single world ( this 6~8th-paragraphs,  this 5th-paragraph ).

↑ But these unseen superposition or parallel worlds can Not be proved only from the random meaningless numbers with a lot of errors (= these high error rates of 65% ~ 99.9% are also just an optimistic guess, because true error rates cannot be obtained from random numbers,  this 6~11th-paragraphs,  this p.3(p.2)-1st-paragraph ).

↑ These random numbers are results of errors, Not of quantum mechanical calculation utilizing the unseen superposition or parallel worlds.

So there is No proof of quantum supremacy nor advantage from the beginning, as seen in today's impractical quantum computers ( this 5~6th-paragraphs ).

Real quantum supremacy or advantage is said to need millions of errorless qubits, which is impossible forever.

(Fig.M)  Millions of qubits are said to be necessary for true quantum advantage, which is impossible in today's useless quantum computers with far smaller numbers of qubits (= only 6 ~ 100 qubits )

Now it is said that more than millions of qubits (= each qubit can take 0 or 1 state ) will be necessary to match the current practical classical computers or achieve real quantum supremacy.

The present quantum computers with far fewer qubits are completely useless and hopeless ( this 4th-paragraph,  this-2nd-paragraph used only fewer than 10 impractical qubits ).

This 6th-paragraph says
"The point where quantum computers overtake classical ones is known as quantum supremacy, but achieving this milestone in a practical way would need a quantum computer with millions of qubits. The largest machine today has only about 1,000 qubits."

No quantum supremacy in today's impractical quantum computers with far fewer than millions of qubits.

Still No quantum supremacy or advantage.

The 1st, 4th, 7th, and last paragraphs of this recent hyped news (8/12/2024) say ↓

"most professionals believing it will (= still unrealized ) outperform classical computers in certain tasks within a decade"  ← This says quantum computers did Not outperform classical computers so far (= all the quantum advantage or supremacy claims so far were fake ).

"quantum computing still faces significant technical hurdles. The survey identified two main challenges: scalability (33.1%) and error correction and fault tolerance (30.9%)"

"Currently, most quantum computing use is (deceptive) hybrid, combining quantum and classical approaches"  ← This hybrid computer is actually a classical computer, and the useless quantum computer can do nothing.

"However, the key question remains: When will quantum computers be truly useful and profitable ? The survey results suggest that many in the field believe this milestone is approaching, but significant work remains to be done."  ← Quantum computers are deadend, far from useful.

Google's quantum computer uses only 53 superconducting qubits consisting of classical LC circuits; No quantum mechanics is involved.

(Fig.S)  A superconducting (= transmon) qubit is just a classical circuit where two different electric currents' energies or frequencies represent 0 and 1 bit states.

Google used a quantum computer named Sycamore consisting of 53 superconducting qubits called transmon or artificial atoms.

This superconducting transmon qubit, used by major companies such as Google and IBM, consists of a classical LC circuit with ordinary capacitors, inductors and a Josephson junction, where the different amounts of charge and current represent each qubit's 0, 1, or middle-(fictional) superposition state ( this 4th-last~3rd-last-paragraphs ).

Cooper pairs, in which two negative electrons allegedly attract each other via a fictional phonon quasiparticle inside a superconductor (= real electrons cannot attract each other ! ), can be explained by classical electrons ( this p.4 ) attracting each other through positive nuclei (= fictional quantum mechanical quasiparticles are unnecessary ).

Each bit or qubit's 0 or 1 state can be manipulated by classical microwave ( this p.2 ).

This qubit gate operation by classical microwave pulse is said to generate (fictional unseen) quantum superposition or parallel world states of the simultaneous 0 and 1 states in each qubit, though there is No evidence of quantum superposition or parallel worlds.

Quantum supremacy is useless just outputting random meaningless numbers.

Quantum computer's (fake) supremacy and advantage experiments can only output random meaningless numbers into which they tried to hide errors.

The problem is that all the present quantum computers are too error-prone to give right answers.

Correcting quantum computer's errors is still impossible ( this 3rd-paragraph ).

This is why all the experiments claiming the (fake) quantum computer supremacy or advantage could just output random meaningless numbers into which physicists tried to hide inconvenient errors (= they tried to take advantage of random meaningless numbers remaining random even if they contain a lot of errors ).

This 8th-paragraph says
"Nothing useful in any practical sense — in fact, it is randomly generated, a random quantum circuit."

This 2nd-paragraph says
"The calculation has almost No practical use—it spits out a string of random numbers. It was chosen just to show that Sycamore can indeed work the way a quantum computer should. Useful quantum machines are many years away, the technical hurdles are huge,"

This 2nd, 16th paragraphs say
"Given the task of finding a pattern in a seemingly random series of numbers,"

"Sceptics also argue that Google has only solved a very narrow task, and that quantum computing is still a long way away from practical use (= just outputting random meaningless numbers with No practical use )"

This 7th-paragraph says
"Their program, run on more than 50 qubits, checks the output of a quantum random-number generator. Some critics have complained this is a contrived problem with a limited real-world application (= useless random number generator with a lot of errors )"

Quantum supremacy needs fictional quantum parallel worlds.

The quantum supremacy or advantage claim is fake, based on the unfounded assumption that 53 qubits could use 2⁵³ unseen parallel universes to get random numbers.

(Fig.Q)  Quantum supremacy claims assume that 53 qubits take 2⁵³ (unseen) superposition or parallel-universe states.

Quantum supremacy is illusion, if quantum parallel worlds are unreal.

All the dubious quantum supremacy or advantage claims are based on the unfounded assumption that each bit or qubit (= just a classical circuit ) can take the 0 and 1 states simultaneously using imaginary quantum superposition, a dead-and-alive cat living in fictional parallel worlds.

So Google Sycamore's 53 qubits were said to take as many as 2⁵³ different (unseen) quantum superposition or parallel-universe states (= they assume one qubit can take the 2 states 0 and 1 simultaneously × 53 qubits ), interfering with each other to obtain some random meaningless numbers ( this-5th-paragraph,  this-4th-paragraph,  this 8th-paragraph ).

↑ Though there is No evidence of quantum superposition (= a dead and alive cat, which is unseen ), parallel worlds or multiverse.  ← If quantum superposition or parallel worlds are unreal, No quantum supremacy.

The 6-7th, 9-10th, and last paragraphs of this website say
"Quantum computers use quantum bits, or "qubits," which can exist as both 1 and 0 simultaneously. This bizarre consequence of quantum mechanics is called a superposition state"

"Google's new computer with 53 qubits can store (unseen) 253 values, or more than 10,000,000,000,000,000 (10 quadrillion) combinations (= just fictional unseen superposition state )"

"Google's quantum computer consists of microscopic circuits of superconducting metal that entangle 53 qubits in a complex superposition state. The entangled qubits generate a random (meaningless) number between zero and 253"

"For classical computers, it is much more difficult to compute the outcome of these operations, because it requires computing the probability of being in any one of the 253 possible states, where the 53 comes from the number of qubits (= classical computer was unfairly forced to simulate the fictional 253 superposition or parallel worlds, and outperformed quantum computers after all )"

"the field is still in its infancy and practical quantum computers remain far on the horizon (= useless fake quantum supremacy )"

Quantum supremacy based on unproven fictional superposition or parallel worlds is just fake.

But it is impossible to observe such unrealistic quantum superposition states or parallel universes.
All we can observe is one single qubit state, 0 or 1 (= we can observe only one cat's state, dead or alive, due to "superposition collapse by measurement",  this 4~5th-paragraphs,  this 5-6th-paragraphs ) in only one single world ( this 26-29th-paragraphs ).

What they call a quantum superposition qubit state of 0 and 1 is just a transient intermediate state between 0 and 1 (= like when the direction of a nuclear magnetic moment or an ion's energy state is changing into another direction or state ), or a classical mixture of the 0 and 1 states, as in the so-called Hadamard gate where just two different electron currents are mixed classically; it gives only one of the 0 or 1 states when measured, with some probability ( this p.6,  this middle ) or in a (meaningless) random way.

Google's quantum computer just illuminated each (classical) qubit with a classical microwave pulse and changed the qubit's state; No fantasized quantum superposition or parallel universes happened ( this p.2-right ).

↑ To prove (illusory) quantum superposition or parallel worlds are real, physicists would have to calculate some (useful) values (= Not random meaningless numbers that conceal errors ) far faster than ordinary classical computers by using the quantum simultaneous parallel(-world) calculation, which is impossible in today's error-prone quantum computers that give only wrong answers.

An ordinary classical computer can take only one bit state, 0 or 1, in one realistic world at a time, so they (wrongly) claim that quantum computers that could utilize the (unseen, fictional) 2⁵³ parallel-universe superposition cannot be imitated by classical computers in the (fake) quantum supremacy experiments ( this 6~8th-paragraph,  this middle ).

Quantum (fake) supremacy random numbers were generated by 99.8% errors.

There is No evidence that the unobservable quantum superposition or parallel universes actually happened in the dubious, error-prone, noisy quantum supremacy experiments.

(Fig.E)  Today's quantum computers are too error-prone to give right answers, so No true quantum supremacy.

There is No evidence that such unrealistic quantum superposition (= dead-and-alive cat ) states or parallel universes actually happened in the process of obtaining random meaningless numbers in the (dubious) quantum supremacy experiments, because their quantum computer could obtain only one random meaningless number at a time, with No parallel universes.

An ordinary classical computer can produce arbitrary random numbers much more quickly than the current bulky quantum computer, so there is No quantum supremacy or speed-up at all from the beginning.

Furthermore, the present quantum computers are too error-prone to give right answers ( this-last-paragraph ), and error correction is impossible.

Google's quantum computer error rate was very bad (= 99.8% ); such a high error rate cannot prove quantum supremacy nor advantage.

The error rate (= or noise rate ) of Google's Sycamore quantum computer was said to be 99.8% (= only 0.2% fidelity = 1 - error rate,  this 9th-paragraph,  this p.16-upper,  this p.3-right,  this p.3-right-3rd-paragraph,  this p.8 ).

This p.3 (or p.2)-1st-paragraph says
"For example, Google’s recent quantum supremacy experiment estimated that their fidelity was merely ∼ 0.2% (i.e, the experiment was ~ 99.8% noise = 99.8% error rate ) and that their fidelity will further decrease as they scale their system"

It means the random meaningless numbers obtained by Google's or Quantinuum's quantum computers (= still Not computers ) were just results of errors (= 99.8% of those random numbers were erroneous wrong numbers ), Not of the fictional quantum superposition or parallel worlds.

↑ This is why they could only output random meaningless numbers into which their 99.8% errors could be hidden (= random numbers remain random even if they contain a lot of errors ).

This 3rd-last~2nd-last paragraphs say
"Quantum computers are Not supreme against classical computers...
implement one very specific quantum sampling procedure with No practical applications"

"The Google computer also lacks the ability to correct errors, which may be key to making a full-fledged quantum computer."  ← Their random numbers were just results of errors or noise ( this p.51-upper ) with No quantum supremacy.

Quantinuum's 56 ion qubits, allegedly surpassing Google, were also error-prone and unable to achieve quantum supremacy

In the recent overhyped quantum supremacy news, Quantinuum's (only) 56 ion qubits (= allegedly surpassing Google's quantum computer ), just outputting random meaningless numbers, were also too error-prone (= slightly better than Google, though ) to give right answers, and unable to achieve quantum supremacy or scale up.

This 6~11th paragraphs ( or this 4~7th-paragraphs ) say
"The point where quantum computers overtake classical ones is known as quantum supremacy, but achieving this milestone in a practical way would need a quantum computer with millions of qubits. The largest machine today has only about 1,000 qubits (= so like Google, this latest Quantinuum's only 56 qubits far less than millions of qubits can Not achieve quantum supremacy )"

"The reason we would need so many qubits for "quantum supremacy" is that they are inherently prone to error, so many would be needed to correct those errors (= quantum error correction is impossible )."

"The team tested the fidelity of (Quantinuum's) H2-1's output using what's known as the linear cross entropy benchmark (XEB). XEB spits out results between 0 (none of the output is error-free) and 1 (completely error-free)"  ← This XEB means fidelity (= 1 - error rate ) from 0 (= 100% error rate ) to 1 (= 0% error rate ).

"They registered an XEB (= fidelity ) result of approximately 0.002 with the (Google's) 53 superconducting qubits built into Sycamore (= Google fake supremacy's fidelity was only 0.2%, which means its error rate was 99.8%, which error-prone results cannot prove quantum supremacy )"

"But in the new study, Quantinuum scientists — achieved an XEB score of approximately 0.35. This means the H2 quantum computer can produce results without producing an error 35% of the time"  ← Even this latest Quantinuum's 56-qubit error rate was 65%, which could Not give right answers, hence No quantum supremacy.

↑ This is why Google and Quantinuum could only give random meaningless numbers into which their erroneous wrong results were hidden in their fake quantum supremacy experiments.

Chinese and Canadian Xanadu photon quantum computer's advantage is also fake and useless.

(Fig.C)  (fake) Quantum advantage needs the unfounded assumption that a photon can split into multiple photons by (fictional) quantum superposition or parallel worlds.  Ordinary classical light wave can split, so No quantum advantage.

Also in the Chinese and Canadian Xanadu photon quantum computers' (fake) advantage experiments, they just randomly detected weak lights or about 100 photons with No meaningful computation, which is completely useless ( this-last-paragraph,  this-8th-paragraph-caveats,  this-last-paragraph,  this-last-paragraph ).

This fake quantum advantage claim is based on the false assumption that (non-existent, imaginary) classical photon balls are indivisible, unable to split into two paths at each beam splitter, while quantum photons (= just weak classical light waves ) could split and interfere with each other (= because a quantum photon is just a weak, divisible classical wave ).

↑ They just detected about 100 random photons at photodetectors without any meaningful computation (= called random boson sampling,  this p.1-right ), and claimed these quantum photons might go through many (unseen, fictional) quantum superposition (= split into many paths ) or parallel universes until measured at photodetectors ( this-scenario,  this 2~4th-paragraphs,  this 3rd-paragraph,  this 2nd-paragraph,  this 3~5th-paragraphs ).

↑ Because the fictitious classical photon balls could neither split nor utilize quantum parallel universes, they wrongly claimed that the quantum photons' splitting and interference at beam splitters cannot be simulated by (fictitious) indivisible classical photon balls, which might be quantum advantage ( this 4th-paragraph,  this 4-5th-paragraphs ).

First of all, there are No such things as indivisible classical photon balls, so if we treat the (fictional) quantum photons as an ordinary divisible classical light wave, we can naturally explain the random light interference and detection in the (pseudo-)quantum advantage boson sampling by the classical wave, and the quantum advantage disappears and becomes meaningless.

Photon quantum computers (= still Not computers at all ) are known to be the most error-prone due to massive photon loss ( this p.1-right-1st-paragraph ), so it is impossible for such error-prone photon quantum computers to give right answers or outperform the ordinary practical classical computers ( this-abstract,  this 2nd-paragraph ).

Classical computer beat quantum computer supremacy after all.

Classical computers could outperform and beat the (error-prone) quantum computers even if they were unfairly forced to imitate non-existent quantum superposition or parallel universes.

(Fig.D)  The practical classical computer outperformed the (error-prone) quantum computer, even when forced to imitate fictional quantum parallel worlds.

There is No evidence that (fictional, unseen) quantum superposition or parallel universes actually happened, so there was No quantum supremacy nor advantage from the beginning.

Even when classical computers were unfairly forced to imitate such imaginary non-existent quantum simultaneous superposition states or parallel universes, the classical computers could outperform or beat the quantum computers, as I said.

So the idea of quantum supremacy, advantage or speed-up was officially dead ( Grover search algorithm is also meaningless and not faster ).

Faster classical computer was unfairly forced to simulate fictional quantum computer's parallel world's gate operation in fake quantum supremacy experiments.

Qubit's gate operation allegedly generating (fictional unseen) quantum superposition or parallel worlds could be simulated and outperformed by classical tensor network method.  ← No quantum supremacy after all.

In Google's supremacy experiment, physicists just illuminated 53 (classical) superconducting circuits used as 53 qubits with various random microwaves to change the qubits' states 0 ↔ 1 randomly (= each superconducting qubit's charge or current amount represents the qubit's 0, 1, or middle-superposition state ).

↑ Quantum mechanics ridiculously claims that each random qubit ( gate ) operation ( this p.23 ), just illuminating qubits with (classical) microwave pulses, caused each qubit to be in a (fictional, unseen) superposition state = the 0 and 1 states simultaneously, using (fictional) parallel worlds ( this 3rd-paragraph,  this middle-superposition,  this-middle says "superposition" means just "probability" of measuring the 0 or 1 state ).

So they (baselessly) insist that 53 qubits could take 2⁵³ (unseen) superposition states or 2⁵³ parallel universes, which could not be emulated by an ordinary classical computer whose bits can take only the 0 or 1 state.

But as I said, quantum superposition or parallel worlds are unobservable, each qubit is observed to be 0 or 1 state, so there is No evidence of quantum supremacy or speed-up based on quantum superposition (or parallel-world simultaneous calculations ) from the beginning.

In spite of No evidence of quantum superposition or parallel worlds, a classical computer was unfairly forced to simulate the 2⁵³ quantum parallel worlds or superposition states.

Classical tensor network methods could simulate these (unseen) quantum superposition or parallel worlds efficiently and very quickly ( this p.3-2.2-2nd-paragraph,  this p.3~5 ), and outperformed the supposedly supreme quantum computer after all.
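As a toy illustration of the tensor-network idea (my own sketch, not the cited papers' actual method), a quantum state can be stored as a tensor and each gate applied as a plain tensor contraction:

```python
# Toy illustration of the tensor idea behind classical simulation:
# a 3-qubit state is a rank-3 tensor, and applying a gate is just a
# tensor contraction (numpy.einsum). Real tensor-network simulators
# contract far larger networks cleverly; this only shows the basic step.
import numpy as np

psi = np.zeros((2, 2, 2), dtype=complex)
psi[0, 0, 0] = 1.0                               # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]]).reshape(2, 2, 2, 2)  # indices (c',t',c,t)

# Apply H to qubit 0, then CNOT to qubits 0,1 -- each as a contraction.
psi = np.einsum("ai,ijk->ajk", H, psi)
psi = np.einsum("abij,ijk->abk", CNOT, psi)

# Result: amplitudes 1/sqrt(2) on |000> and |110> (a Bell pair on qubits 0,1).
print(np.round(psi.reshape(-1), 3))
```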

So quantum supremacy or advantage was officially dead (as seen in nobody using such useless quantum computers in daily life ).

 

Quantum computer is too error-prone to give right answers, and unable to correct their errors.

Quantum computer error correction is disastrous and hopeless; quantum computers are unable to correct their errors (= impractical forever ).

(Fig.S)  Quantum error correction is disastrous, impractical forever, increasing errors instead of decreasing them !

Quantum error correction is hopeless, deadend.

All the present quantum computers are useless, too error-prone to give right answers.

The 1st-paragraph of this news says
"quantum computers are extremely sensitive to noise introduced through interactions with their environment. This noise leads to computational errors, and today’s quantum computers are too error-prone to compute the answers to problems that are of practical utility"  ← Today's impractical quantum computers are too error-prone to give right answers.

And unlike the ordinary classical computers such as laptops, correcting these errors in quantum computers is impossible ( this challenge 2,  this 4th~ paragraphs ).

This last-paragraph says
"the jury is still out on how to overcome the problem of error-prone qubits"  ← Still No error correction.

The 3rd and 16th paragraphs of this recent hyped news say

"However, many such error-correction schemes involve duplicating information across hundreds or thousands of physical qubits at once, which quickly becomes hard to scale up in an efficient way."

"The framework is still theoretical (= just theory, No experimental realization )"

A practical quantum computer will need millions of qubits with an extremely low error rate of less than 10⁻¹⁵, which is impossible to realize.

Today's impractical quantum computers have only fewer than 1000 qubits with high error rates of 10⁻² ~ 10⁻³, which cannot even give right answers, and error correction is impossible.

(Fig.P)  The future practical quantum computer is said to need more than millions of qubits and a less-than-10⁻¹⁵ error rate, which is impossible to achieve forever.

Today's quantum computer's error rate (= 10⁻³ ) is far worse than the practically-required error rate of 10⁻¹⁵

Even the current best quantum computer's error rate per two-qubit gate operation is about 1 ~ 0.1% (= 10⁻² ~ 10⁻³ ), which is far worse than the ordinary practical classical computer's error rate of less than 10⁻¹⁷ ( this p.1-left,  this 2nd-paragraph ).

The practical quantum computer is said to need a far lower error rate of less than 10⁻¹⁵ ( this 5th-paragraph,  this-3rd-paragraph,  this 2nd-paragraph,  this abstract ), and more than millions of qubits for correcting errors ( this 19~21th-paragraphs,  this 2~6th-paragraphs ), which is impossible to realize forever.

The 2nd and 22-23rd paragraphs of this recent site say ↓

"However, a significant hurdle remains: error correction. With every step of a quantum computation, errors accumulate, ultimately leading to inaccurate results."

"If you have two nines of fidelity, i.e., 99%, one way forward is to use lots of qubits for error correction. It’s inefficient,"

"We may never achieve a physical error rate as low as 10-15, one error in a quadrillion operations, which would be comparable to modern-day transistors."  ← Today's error-prone quantum computers are No match for ordinary classical computers.

This 9th paragraph says
"They're highly susceptible to "noise" (interference), which leads to computational errors. One of the most common breeds of quantum computers is described as being noisy intermediate-scale devices. They're prone to error,"

Fidelity is equal to 1 - error rate.

Fidelity (= F ) is the opposite of error rate.
Fidelity F = 1 - error rate ( this p.3-left-2nd-paragraph,  this p.59 ).

So when the fidelity is 98%, the error rate is 2% ( this p.1-left-1st-paragraph,  this 18th-paragraph ).

This 3~4th-paragraphs say
"Even with a so-called state-of-the-art fidelity of 99.9 percent, an error still occurs in one out of every 1000 operations (= error rate is 0.1% ). In the realm of quantum computing, where complex algorithms require many thousands or even millions of qubit operations, such error rates are prohibitively high. Conducting long calculations becomes practically impossible without accumulating significant errors that render the results unreliable."

"By contrast, classical computers exhibit minuscule error rates. Thanks to highly reliable semiconductor technology and sophisticated error correction techniques, classical computers achieve error rates as low as one error per quintillion operations (that's one in 1018 )"

Google's quantum computer supremacy was unreal due to its too-high error rate of 99.8%

(Fig.E)  Today's quantum computers are useless and too error-prone, with 65% ~ 99.8% error rates (= when just flipping qubits 10 ~ 20 times ), which cannot give right answers except for random meaningless numbers.

The error rate of each two-qubit gate operation of the Google superconducting quantum computer is about 0.5 ~ 0.6% (= 5.0 × 10⁻³ ~ 6.0 × 10⁻³,  this p.5-left-3rd-paragraph,  this p.3-left-5th-paragraph ), which is still too high to calculate right values ( this abstract ).

In Google's (fake) quantum supremacy experiment (= their supremacy has been refuted now ), the total fidelity (= F_XEB ) of random qubit operations ( after 53 qubits × 20 random-qubit-operation cycles ) was only 0.2% ( this p.3-right-1st-paragraph ).

↑ It means the final (accumulated) error (= noise ) rate of the Google supremacy experiment was 99.8% (= 100% - 0.2%,  this p.3(or p.2)-1st-paragraph,  this p.16-upper ), which is just an erroneous wrong answer.

↑ As a result, Google quantum supremacy random numbers were just results of errors (= 99.8% error rate ), Not of quantum mechanical meaningful calculations (= though they just did random meaningless operations ), so No evidence of quantum supremacy.
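A rough back-of-envelope check (my own arithmetic, not from the paper, treating each of the 53 qubits × 20 cycles as one operation at the quoted ~0.6% error rate) lands near the reported 0.2% total fidelity:

```python
# Back-of-envelope (mine, not the paper's): compound the quoted ~0.6%
# per-operation error over 53 qubits x 20 cycles, and compare with the
# reported ~0.2% total fidelity (F_XEB).
per_op_error = 0.006          # ~0.6% per two-qubit gate (quoted above)
n_ops = 53 * 20               # 53 qubits x 20 random-gate cycles

total_fidelity = (1 - per_op_error) ** n_ops
print(f"{total_fidelity:.4f}")  # ~0.0017, i.e. ~0.2% fidelity = ~99.8% error
```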

Quantinuum's 56-ion-qubit random sampling was also useless and error-prone, with a more than 65 ~ 80% error rate.  ← No quantum supremacy

All the present (fake) quantum supremacy experiments try to hide errors into random meaningless numbers with No practical use, no advantage.

The recent overhyped Quantinuum 56-ion-qubit quantum computer (= still Not a computer ) was also too error-prone to give right answers (= this is why they could deal only with random meaningless numbers,  this-middle-The supremacy definition leaves the door open for debate ).

This headline and 7th paragraph say
"This one only produces errors 65 percent of the time."  ← 65% error rate is too high to give right answers.

"The demonstration, detailed in this paper, involved an algorithm called random circuit sampling"

Quantinuum's 56 qubits still suffered a 65% error rate, while Google's was a 99.8% error rate; neither can give right answers.

This 2nd-paragraph says
" Quantinuum’s new 56-qubit computer, called H2-1, ran simulations without a single error about 35% of the time (= meaning still 65% high error rate ), while Google's – 0.2% of the time (= Google's quantum computer was 99.8% high error rate )"  ← This high error rate cannot give right answers nor prove quantum supremacy.

This 8-9th-paragraphs say
"They used the linear cross entropy benchmark (XEB = fidelity), which produces results between 0 (completely error-prone) and 1 (entirely error-free)."

"In this quantum performance, the H2-1 (= Quantinuum's 56 ion qubits ) hit an impressive high note. It achieved an XEB score of approximately 0.35, meaning it can produce error-free results 35% of the time (= still 65% high error rate ).

".. This score is a significant improvement over Google’s Sycamore, which registered an XEB (= fidelity ) result of about 0.002 (= Google's supremacy experiment's pseudo-random numbers were just results of 99.8% error rate, Not of quantum mechanical calculation, so No quantum supremacy ) in 2019"

True quantum advantage needs millions of errorless qubits, which is impossible forever.

This 6th-paragraph says
"The point where quantum computers overtake classical ones is known as "quantum supremacy," but achieving this milestone in a practical way would need a quantum computer with millions of qubits. The largest machine today has only about 1,000 qubits."  ← Even this latest Quantinuum has only 56 (impractical) ion qubits, far from millions of qubits required for true quantum supremacy.

The Quantinuum research paper showed that only 20 qubit gate operations caused an 80% error rate, which cannot give right answers.

This research paper ↓

p.2-left-2nd-paragraph says "executed without a single error about 35% of the time"  ← It means 65% error rate.

p.2-right-3rd-paragraph-last says "the 2Q (= 2-qubit ) gate fidelities on H2 (= their ion-qubit quantum computer ) up to the 99.9% level (= each gate error rate was 0.1% ) achieved on H1 (= this error rate per 2-qubit gate operation is still too high )"

p.13-Fig.9-(a) shows the total fidelity (= 1 - error rate ) of Quantinuum's 56 ion qubits (= N = 56 ) after 20 qubit gate operations (= depth = 20 ) was less than 0.2 (< 20% ), which means the total (accumulated) error rate was more than 80%.

↑ Fig.9a shows the fidelity after 12 qubit operations (= depth 12 ) was 0.35, which means a 65% error rate.

p.13-right-2nd-paragraph says "none of these results (hardness in the absence of noise or easiness at fixed noise rate) is conclusive (= No evidence of quantum supremacy )"

↑ Even the latest quantum computer with the best (= lowest ) error rate could Not give right answers (= the final error rate was more than 80%, which would get worse in more qubit operations ), which is why they focus only on random meaningless numbers into which errors could be hidden.
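A similar back-of-envelope check (my own arithmetic, not the paper's): the quoted fidelity of 0.35 at depth 12 implies a per-layer fidelity whose extrapolation to depth 20 falls below 0.2, consistent with the Fig.9a numbers described above:

```python
# Back-of-envelope (mine): infer a per-depth-layer fidelity from the
# quoted XEB ~ 0.35 at depth 12, then extrapolate to depth 20.
f_depth12 = 0.35
per_layer = f_depth12 ** (1 / 12)        # ~0.916 fidelity per depth layer
f_depth20 = per_layer ** 20

print(f"per-layer fidelity ~ {per_layer:.3f}")
print(f"predicted depth-20 fidelity ~ {f_depth20:.2f}")  # ~0.17, i.e. >80% error
```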

IBM, Harvard quantum computers are also too error-prone to give right answers.

The IBM 127-qubit quantum computer also had a too-high error rate (= the error rate per 2-qubit gate operation was 1.17 × 10⁻²,  this p.4-Fig.S1 ), which could Not give right answers.

IBM started to rely on illegitimate error-mitigation by a classical computer, without quantum error correction ( this 2nd-last-paragraph,  this 2nd-last~3rd-last paragraphs ).  ← Of course, this illegitimate error mitigation was outperformed by an ordinary classical computer, so No quantum advantage.

Harvard-QuEra's latest 60 atomic qubits also had a too-high error rate (= the error rate per 2-qubit gate operation was 0.5% = 5.0 × 10⁻³, or the fidelity was 99.5%,  this abstract ), which cannot give right answers, so they started to rely on ad-hoc post-selection without error correction.

The topologically-protected robust quantum computer based on fictional quasiparticles is unreal, lacking evidence of being "robust".

There is No evidence that the (illusory) topological quantum computer is robust against errors.

Because they have Not realized even one single topological qubit allegedly based on (fictional) Majorana or anyon quasiparticles, let alone the hyped robust quantum computer.

Error-prone quantum computers must make a (fictitious) logical qubit consisting of many qubits just to detect errors.

Error-prone quantum computers must rely on a (fictitious) logical qubit consisting of multiple data qubits (= for calculation ) and ancilla qubits used for indirectly detecting errors, a process called syndrome or stabilizer measurement.

(Fig.Q)  One logical qubit for calculation consists of multiple physical qubits (= each physical qubit or data qubit can not be used for calculation directly ) for detecting errors.

One (logical) qubit for calculation consists of multiple physical qubits for correcting errors.

For today's error-prone quantum computers to detect errors, multiple (physical) qubits must be treated as one fictitious (logical) qubit.

This 5th (or 6th ) paragraph says
"scientists can form one logical qubit, storing the same information (= 0 or 1 ) in all of the constituent physical qubits. If one or more physical qubits fails, the calculation can continue because the information is stored elsewhere ( this 7th-paragraph )."

So for example, the quantum computer with 250 (physical) qubits cannot use 250 qubits directly for calculation, instead, it must split the original 250 qubits into a smaller number of (fictitious) logical qubits (= ex. 250 qubits is split into only 5 logical qubits = each logical qubit consisting of 50 physical qubits,  ← This 250-qubit quantum computer can utilize only 5 qubits or 5 bitstring 00000 for calculation ).

Only logical qubits (= each logical qubit can take only the 0 or 1 state ) are important and available for actual calculation.  Today's error-prone quantum computers cannot realize even one reliable logical qubit; they are hopeless, still Not a computer.

When some physical qubits inside one (fictitious) logical qubit show errors, they correct these errors from the remaining intact physical qubits like a majority vote ( this 8~21th paragraphs,  this p.8 ).

Think about the case where one fictitious logical qubit consists of 3 (physical) qubits.

↑ When one logical qubit (= 1 ) is expressed by three physical qubits (= 111 ), and one physical qubit shows errors, flipping 1 into 0 like 111 → 101, the remaining intact two physical qubits are used to correct one erroneous physical qubit like 101 → 111 by majority rule = the original intact logical qubit 1 can be recovered ( this p.4,  this p.4-2nd-paragraph,  This 5~6th paragraphs ).

So physical qubits are used for error detection instead of being used directly for calculation.

Only logical qubits (= each consisting of multiple physical qubits ) can be used for actual calculation (= one logical qubit can express only a 0 or 1 bit state, and one physical qubit alone is not allowed to express a 0 or 1 bit state for calculation,  this 3rd-4th paragraphs ).
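A minimal sketch of this majority-vote idea, simulated with ordinary classical bits (the flip probability p = 0.1 is illustrative, not from the text):

```python
# Minimal sketch of the 3-qubit repetition ("majority vote") idea
# described above, simulated with classical bits and an illustrative
# bit-flip probability p. One logical bit = 3 physical copies.
import random

def noisy_copy(bit, p):
    return bit ^ (random.random() < p)      # flip the bit with probability p

def decode_majority(bits):
    return int(sum(bits) >= 2)              # majority vote over 3 copies

random.seed(2)
p = 0.1                                     # physical flip probability
trials = 100_000
logical_errors = 0
for _ in range(trials):
    physical = [noisy_copy(1, p) for _ in range(3)]   # encode logical "1" as 111
    logical_errors += decode_majority(physical) != 1

# Majority vote fails only if >= 2 of the 3 copies flip:
# p_L = 3*p**2*(1-p) + p**3 = 0.028 for p = 0.1
print(logical_errors / trials)              # ~0.028 < 0.1: fewer logical errors
```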

In fact, the current error-prone qubits can Not implement the error correction that is said to need more than millions of qubits ( this 14~16th-paragraphs,  this p.1-right-1st-paragraph ), which is impossible to realize ( this 11~21th-paragraphs ).

This abstract says
"full fault-tolerant quantum computing (FTQC) based on quantum error correction code remains far beyond realization due to its extremely large requirement of high-precision physical qubits"

One logical qubit consists of multiple data physical qubits and error-detection ancilla qubits in the tricky quantum error correction.

The problem is that a quantum computer is Not allowed to measure each qubit directly, because the measurement is said to destroy the (unobservable) quantum superposition or parallel-world state ( this p.1-left ) needed for (fictional) simultaneous calculations.

This p.2-(ii) says
"Measurement destroys quantum information. In contrast to the classical case checking for errors is problematic. Monitoring means measuring, and measuring a general quantum state alters it. Thus it seems that any attempt at error correction must destroy important quantum information."

So in a quantum computer, each logical qubit consists of multiple data qubits (= for storing data for calculations ) and multiple ancilla or stabilizer qubits, which are 'entangled' with the data physical qubits and used for indirectly detecting errors of the data qubits without destroying the (fictitious) superposition states ( this p.6 ).

For example, in the upper-figure-middle, each logical qubit consists of 6 qubits: 3 data qubits + 3 ancilla qubits.

↑ But today's error-prone quantum computers easily cause errors not only in data qubits but also in ancilla qubits in each measurement or qubit manipulation, so accurate detection and correction of errors are impossible.

Measurement of ancilla (= syndrome, stabilizer ) qubits for indirectly detecting errors.

In principle, when these error-detection ancilla physical qubits are measured to show some error-flag sign (= indirectly showing errors in data qubits ), the corresponding data qubits inside the same logical qubit, which are expected to have errors, should be corrected ( this middle ).

But the qubit operation correcting qubit errors itself causes new errors, so this ideal quantum error correction is impossible.

This measurement of ancilla qubits for indirectly detecting errors is called syndrome ( this p.7-lower-Diagnose,  this p.4-right-upper ) or stabilizer measurement ( this p.2~3-Results,  this p.9~14,  this p.15 ).

This "Quantum error correction with surface codes" says
"Bob wants to send Alice a single bit that reads "1" across a noisy communication channel. Recognizing that the message is lost if the bit flips to "0", Bob instead sends three bits: “111 (= one logical qubit consisting of 3 physical qubits )”. If one erroneously flips, Alice could take a majority vote = error can be corrected like 101 → 111 (= a simple error-correcting code )"

"we arrange two types of qubits on a checkerboard. Data qubits (= used for calculation ) on the vertices make up the logical qubit, while measure qubits (= error detection qubits ) at the center of each square are used for so-called stabilizer (= error detection ) measurements. These measurements tell us whether the qubits are all the same, as desired, or different, signaling that an error occurred"

Quantum computer's error correction is useless, just increasing error rates instead of decreasing errors.

(Fig.C)  Today's impractical quantum computers just increase their errors by the error correction operation, so quantum error correction is impossible.

The serious problem is that the current quantum computers are too error-prone to correct errors.

So when physicists try to detect and correct errors by manipulating qubits, ironically, this qubit manipulation for error correction itself increases or worsens the original error rate instead of decreasing it (= error-worsening operation ).

In Google's latest error correction experiment (= actually, they did Not correct errors, this 3rd-paragraph ), one logical qubit consisted of 49 (or 17 ) physical qubits, whose error rate increased by 3.0% each time an error correction operation (= cycle ) was conducted ( this 3rd-last-paragraph,  this 3rd-last-paragraph ).

This 9~10th paragraphs say
"This means that a 3 x 3 surface code uses 9 data qubits, plus 8 measure qubits for a total of 17 making one logical qubit (= smaller logical qubit surface code ). A 5 x 5 surface code uses 25 data qubits, plus 24 measure qubits to make a total of 49 in one logical qubit (= larger logical qubit surface code )."

" the team found that the larger surface code (= one logical qubit consists of 49 physical qubits ) delivered better logical qubit performance ( 2.914 percent logical errors per cycle ) than the smaller surface code (3.028 percent logical errors per cycle)"

↑ It means one manipulation or cycle of detecting and correcting errors of a logical qubit increased the error rate by about 3.0% instead of decreasing errors (= error-worsening error-correction operation ).

This Google research paper p.4-Fig.3-b shows logical (qubit) fidelity decreased to only 0.2 (= which means error rate increased to 80% ) after 25 error correction cycles, instead of decreasing errors.

↑ This is Not an error correction but an error-worsening operation, showing quantum error correction for decreasing errors is impossible ( even in Google's latest research ).
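A simple exponential model (my own, not the paper's) of the quoted ~3.0% logical error per correction cycle shows how quickly fidelity collapses; note the experiment's measured fidelity (~0.2 after 25 cycles) decayed even faster than this model predicts:

```python
# Simple exponential model (mine, not the paper's) of the quoted ~3.0%
# logical error added per error-correction cycle.
error_per_cycle = 0.030
for cycles in (1, 10, 25, 100):
    fidelity = (1 - error_per_cycle) ** cycles
    print(f"{cycles:>3} cycles -> modeled logical fidelity {fidelity:.2f}")
# 25 cycles -> ~0.47 in this model; the experiment reported ~0.2,
# i.e. the real decay was even worse than simple compounding.
```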

Error rate increases by more than 2.70% per (fake) error correction operation (= which is useless ) instead of decreasing.

Also in Honeywell's 10-ion-qubit experiment, the total error rate increased by 2.70% per quantum error correction operation (= QEC, this p.4-Fig.3, p.4-right-1st-paragraph ) instead of decreasing.

So error correction for decreasing errors in today's error-prone quantum computers is impossible.

The recent Chinese research also showed an impractically high error rate of about 12% (= fidelity was 0.877 ) for a logical qubit consisting of 17 superconducting physical qubits ( this 4th-last-paragraph ), merely in preparing a slightly complicated qubit state called a magic (= non-Clifford ) state, where the probabilities of a qubit's 0 and 1 are a little complicated ( this p.8-10 ), which is impractical.

This research paper ↓

p.3-left-Experimental implementation says "using 17 out of the 66 qubits on the Zuchongzhi 2.1 superconducting quantum system. This 17-qubit distance-three surface code consists of 9 data qubits, 4 X-type ancilla qubits and 4 Z-type ancilla qubits (= each logical qubit consists of 9 data physical qubits and 8 ancilla qubits )"

p.4-right-(5)-2nd-paragraph says the logical (qubit) fidelity of the magic state was 0.877 (= error rate was about 12%, which was too high to be practical ).

p.4-right-3rd-paragraph says ". After the error correction procedure, the fidelity of the logical states at each point is improved and the logical error rates per round are reduced to 24.77% and 25.65%, respectively (= their error rate increased from the initial ~12% ! )"

↑ It means the total error rate increased by 24~25% per error correction round ( p.5-Fig.4 shows fidelity decreased, or error rate increased per round )

As a result, the current quantum computer's error correction just increased the total error rate instead of decreasing errors due to too error-prone qubits, which is completely useless.

Ad-hoc Post-selection = physicists gave up correcting errors in the current extremely error-prone quantum computers.

Physicists gave up error correction, and instead try to illegitimately post-select (or pre-select ) only qubits that luckily avoid errors (= discarding all the remaining qubits that caused errors ).

(Fig.P)  Today's quantum computers cannot correct errors, instead, they just discard erroneous qubits in illegitimate post-selection.

The current quantum computers are so error-prone that their error correction is impossible.

So physicists started to rely on illegitimate post-selection methods without error correction ( this 6th-paragraph,  this 4th-last-paragraph ).

In these ad-hoc post-selection methods, they simply discard all qubits that show errors, and post-select only the qubits that luckily avoid errors, without performing any error correction operation ( this p.15 ).

This p.1-left-3rd-paragraph says
"which means that detected errors cannot be uniquely identified and corrected. We must therefore rely on postselection to find and discard cases where an error occurred."

This p.7-left-2nd-paragraph says
" this experiment uses post-selection rather than correction"

This post-selection method is completely impractical.

Because in large calculations, almost all qubits in the current error-prone quantum computers show errors and must be discarded.

Only a tiny percentage of qubits that avoid errors by chance remains.  ← It is impossible to calculate large numbers with such a small number of post-selected remaining qubits.
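A rough estimate (= my own illustrative numbers, assuming independent gate errors ) of the post-selection yield makes the point: if each gate succeeds with probability 1 − p, the fraction of runs with no error at all shrinks exponentially with circuit size.

```python
p = 0.01                                     # assumed ~1% error per gate
for gates in (10, 100, 1_000, 10_000):
    print(gates, "gates -> surviving fraction:", f"{(1 - p) ** gates:.3g}")
# 10 gates -> surviving fraction: 0.904
# 100 gates -> surviving fraction: 0.366
# 1000 gates -> surviving fraction: 4.32e-05
# 10000 gates -> surviving fraction: 2.25e-44  (= essentially every run discarded)
```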

Microsoft and Quantinuum's latest 56 or 30 ion qubits (= divided into only ~12 logical qubits ) relied on illegitimate post-selection and pre-selection of intact qubits without error correction.

See this page and this page

Harvard-QuEra (deceptive) fault-tolerant quantum computer just post-selects qubits instead of error correction

See this page.

Caltech's "erasure" error correction using only two qubits also just discarded qubits that showed errors instead of correcting errors, which is impractical.

The recent Caltech and Amazon quantum erasure error correction just used only two impractical qubits with No error correction.

This research paper ↓

p.1-left-2nd-paragraph says "In this regime, the laser drives a unitary operation, U(t), that naturally results in the two atoms forming a Bell state, |Ψ⟩ = ( |gr⟩ + |rg⟩ )/√2, between the ground (= g ) and Rydberg (excited) states (= r )"

↑ They used only two atoms as two qubits; each atom's two energy levels, driven by laser light, represented a qubit's 0 (= g ) or 1 (= r ).  ← This ground state (= g ) was actually the first excited state (= P ) instead of the true lowest-energy ground state (= S = treated as the erasure-error state ).

p.1-left-last-paragraph~p.2-top says "This Bell state generation (= one atom is g state, the other atom is highly-excited Rydberg r state ) has several major practical limitations. Of particular interest here are leakage errors to the absolute ground state S, which are converted to erasure errors in our work"  ← The atomic energy levels going down to the lowest ground state (= ¹S₀ or a ) from g or r were treated as "erasure error".

p.2-Fig.1-right-upper says "We can discard data from pairs where atoms are detected in the erasure-error image, termed erasure excision."  ← error qubits (= falling down to the lowest-energy |a> state called erasure error ) were discarded instead of being corrected.  ← No error correction

Fig.1c shows the fidelity of the raw data without discarding erroneous qubits (= purple ) was very bad = 0.94 (= error rate was as high as 6% in just one two-qubit operation, which is impractical ).

p.2-right-3rd-paragraph says "While Bell state generation as demonstrated here is Not a computational two-qubit quantum gate"  ← This experiment just excited two atoms by laser light with No legitimate two-qubit gate operation for computation.

p.5-left-1st-paragraph says "Besides our current demonstration of about 0.999 SPAM-corrected two-qubit fidelity,"  ← still the error rate was 0.001 (= 1 - 0.999 ) or 0.1%, which was much higher than the practically-required error rate of 10⁻¹⁵, so useless.

↑ This SPAM (= state preparation and measurement ) correction is Not a true error correction but just discarding or ignoring the erroneous or lost atomic qubits in the imaging process ( this p.7-Fig.5-upper ).

↑ The 9th paragraph of this news on this research just says
"The Caltech team claims that removing and finding flaws (= instead of correcting errors ) in their Rydberg atom system may (= just speculation ) enhance the overall rate of entanglement or fidelity. According to the latest study, only one in 1,000 pairs of atoms failed to become entangled."  ← Error rate of 1/1000 in each operation is still too high.

 

IBM quantum computers are inferior to classical computers.

See IBM 127-qubit quantum computers showed No advantage

 

Quantum computers are completely useless for calculating molecular energies or discovering drugs.

They force practical classical computers to perform most of the complicated molecular energy calculations, and try to make the useless quantum computer take all the credit for the classical computer's work as a (deceptive) hybrid computer.

(Fig.D)  Quantum computers with small numbers of qubits and high error rates are Not useful calculators at all.  ← Quantum computer's drug discovery will never happen.

Quantum computers are too error-prone, useless for molecular energy calculation or drug discovery, contrary to hypes.

Physicists created a deceptive "hybrid computer" consisting of a practical classical computer and a useless quantum computer with fewer than 20 qubits (= still Not a quantum computer ).

The current quantum computers are useless, too error-prone to calculate any meaningful values.

This 5th-paragraph says
"Quantum processors remain too error-prone to demonstrate real-world advantages over classical approaches to solving practical problems."

This 2nd-paragraph says
"Quantum computing today provides No tangible advantage over classical computing in either commercial or scientific applications ( this 1st~7th-paragraphs )"

This is why physicists had to create (deceptive) hybrid quantum-classical methods ( this p.2 ) such as the variational quantum eigensolver (= VQE ), which is substantially a classical computer, while the useless quantum computer can do almost nothing.

Hybrid quantum method for molecular energy calculation such as variational quantum eigensolver (= VQE ) is actually a classical computer, No quantum advantage.

Today's quantum computers with small numbers of qubits are error-prone, completely useless for molecular energy calculation or drug discovery ( contrary to overhyped news ).

So physicists artificially created the deceptive hybrid quantum-classical method called the variational quantum eigensolver (= VQE ), which is actually a classical computer, while the useless quantum computer can do almost nothing.

This p.1-left says
" The Variational Quantum Eigensolver (VQE) is a hybrid quantum-classical algorithm (= classical computer is necessary ) to obtain the ground state energy..
..This hybrid quantum-classical approach is deemed promising in the current era, where quantum computers are still error-prone and limited in number of qubits"  ← quantum computers are still, error-prone, useless.

The 6th, 11th paragraphs of this site say
"The VQE (= hybrid variational quantum eigensolver ) works by using a quantum computer to prepare a quantum state and measure its energy (= quantum computer just measures something with No calculation ), then using a classical computer to adjust the quantum state in an attempt to lower the energy (= classical computer calculates and finds the lowest energy )"

"Despite the promise of VQAs, there are still many challenges to overcome. One of the main challenges is the noise and errors that are prevalent in today's quantum computers. This noise can cause the quantum part of the algorithm to produce incorrect results"  ← Quantum computers are too error-prone to be useful.

The 16th paragraph of this site says
"Today’s most popular hybrid, an algorithm known as a variational quantum eigensolver (VQE), uses classical computers to approximate a molecule's stable ground state,.. Then a quantum computer takes over to find the ground state’s precise solution. But today's error-prone quantum computers typically struggle with VQEs...
That approaches but still falls short of the classically modeled pentacene"  ← No quantum advantage.
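The structure of the VQE loop described above can be sketched in a few lines (= my own toy example in Python, Not any published VQE code ).  Note where the work happens: the classical loop proposes parameters and decides every update, while the "quantum" step would merely return a noisy energy estimate; here it is a classical stand-in function, which is exactly the point.

```python
import math

def energy(theta):
    # Stand-in for the quantum processor: in a real VQE this step would prepare
    # an ansatz state with parameter theta and estimate the energy from noisy shots.
    return 1.0 - 0.8 * math.cos(theta)       # toy one-parameter energy landscape

theta, step = 2.0, 0.1
for _ in range(200):                          # classical optimizer does all the work
    grad = (energy(theta + 1e-4) - energy(theta - 1e-4)) / 2e-4
    theta -= step * grad                      # finite-difference gradient descent
print(round(theta, 3), round(energy(theta), 3))   # -> ~0.0  0.2 (the minimum)
```

Swapping the stand-in for a real 2 ~ 12-qubit device changes nothing in this structure; the optimization logic stays entirely on the classical side.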

Today's impractical quantum computer with a small number of qubits needs overhyped fake news such as "drug discovery".

To hide the inconvenient truth of the useless quantum computer that cannot calculate molecular energy, the deceptive hybrid computer, which is just a classical computer, was created.

(Fig.H)  Hybrid quantum-classical computer for molecular calculation is just a classical (= Not quantum ) computer.  ↓

More than millions of qubits are necessary for a practical quantum computer, which is impossible.

More than millions of qubits are necessary to achieve true quantum advantage ( this-4th-paragraph,  this 1st-paragraph ), which is impossible in today's useless quantum computer with far smaller numbers of qubits.

This 2nd-paragraph says
"It is said that a quantum computer with a scale of 1 million qubits is needed to realize ultrafast calculations that are impossible with conventional computers, such as quantum chemical calculations."

The 1st, 7th, 10th paragraphs of this news say

"But so far none have conclusively achieved quantum advantage"

".. Small is key because estimates indicate that a world-changing quantum computer will need one million to 10 million qubits"

"Companies that are exploring this technology have popped up in the past few years, including Alpine Quantum Technologies (which has a computer with 24 qubits), IonQ (which has 29 qubits) and Quantinuum (which has 32 qubits)"  ← far from millions of qubits required for quantum advantage.

Today's useless quantum computers with small numbers of qubits (< 100 qubits ) cannot calculate anything.

So this kind of news falsely claiming quantum advantage just by outputting random useless numbers using only 56 error-prone qubits is overhyped fake news.

All news about quantum computers for molecular energy calculation or drug discovery is just hype, and fake.

As I said, today's quantum computers are too error-prone to calculate any molecular energies or discover drugs.

So they fabricated deceptive quantum-classical hybrid computers, which are substantially classical computers (= with billions of errorless bits ), and the error-prone quantum computers with only small numbers of qubits (= just 6 ~ 12 qubits,  this p.2-Fig.1 ) can calculate nothing.

This IBM-Cleveland clinic's alleged 7-amino-acid calculation (= based on the deceptive hybrid method heavily relying on a classical computer ) used fewer than 10 impractical qubits (= each qubit can take only 1 or 0, so this is just a 10-bitstring, which is still Not a quantum computer ).

Recent Chinese quantum computer for drug discovery pipeline is overhyped fake news.

Chinese hybrid quantum computer with only 2 impractical qubits is substantially a classical computer, No quantum advantage.

(Fig.T)   Just 2 ~ 8 error-prone bits or qubits is Not a (quantum) computer at all.

The 1st, 6th, 11th, 15th paragraphs of this hyped news say

"A team of Chinese researchers report that quantum computers may one day (= just speculation ) serve as the backbone of a drug pipeline that could potentially (= just speculation ) revolutionizing drug design, which could transform the pharmaceutical industry."

"However, the researchers noted that quantum computing's role in drug discovery has primarily been limited to conceptual validation, with minimal real-world integration."  ← quantum computer is still impractical.

"In the study, the team used a hybrid computing method (= which is a classical computer ), starting with a quantum emulator (= Not a quantum computer ) before moving to a quantum computer, to simulate the drug’s interaction with its target."

"Despite its promising report, using quantum computers for current drug discovery faces significant limitations. For example, quantum computing is still plagued by longer computational times and errors, which impede its accuracy and efficiency in drug discovery."  ← Quantum computers remain useless after all.

Only 2 impractical qubits, which is Not a quantum computer.

This Chinese research paper ↓

p.2-Figure 1 needed classical computers (= as the deceptive hybrid computer ), so this is Not quantum computer research.

p.3-9th-paragraph says "Despite that quantum devices with more than 100 qubits are becoming available, simulating large chemical systems would require very deep circuits, which will inevitably lead to inaccurate outcomes due to intrinsic quantum noise"  ← quantum computer is too error-prone to be useful

"..Thus, it is ofen desirable to reduce the effective problem size of chemical systems..
The wave function of the active space can then be represented by a 2-qubit (= only 2 qubits, Not a quantum computer ) superconducting quantum device...
the parameterized quantum circuit for VQE (= classical computer pretending to be hybrid )"

p.4-Figure.3 used only 2 qubits (= each qubit can take only 1 or 0, so this is just an impractical 2-bitstring = 00, 01, 10 or 11, Not a quantum computer )

p.9-5th-paragraph says "While our study employed classical pre-optimization instead of parameter optimization on quantum computers due to associated overhead"  ← Ordinary classical computer was used for calculating important parameters.

Quantinuum used only 3 ~ 8 impractical ion qubits for a fake simulation of an H2 molecule.

The current quantum computers' molecular energy calculations were actually done by classical computers as (deceptive) hybrid computers.

The 3rd, 5th, 7-8th paragraphs of this recent hyped news say

"Quantinuum claims this is an “essential step” to speed up the discovery of molecules and reduce the time to generate commercial and economic value in the future (= still unrealized )."

"However, these quantum machines are so sensitive that their calculations suffer from errors. Imperfect control signals, interference from the environment and unwanted interactions between quantum bits – qubits – can lead to "noise" that disrupts calculations."  ← quantum computers are too error-prone to give right answers."

"In a preprint study, the team claims to have overcome the challenge by using an error-detection code which saves quantum resources by immediately discarding calculations if the code detects qubits that have produced errors."  ← discarding erroneous qubits post-selectively instead of quantum error correction.

"The team used this code to make three logical qubits (= only 3 qubits or 3 bitstring 001, which bit number is too small to calculate any molecular energy )"

Quantinuum used only 3 ~ 8 impractical qubits, and all calculations were conducted by an ordinary classical computer.

This Quantinuum research paper ↓

p.4-left-III-1st-paragraph says
"In this work,.. encode four logical qubits... leading to eight qubits used in total (= just 4 ~ 8 qubits can Not calculate any molecular energy )."

p.5-left-last-paragraph says ".. to calculate the coefficients of the spin Hamiltonian (1) on classical computers. The classical pre and post-processing for parameter selection (= almost all molecular energy calculation was done by classical computer, Not by the impractical quantum computer with only 3 ~ 8 ion qubits )"

Quantum computers cannot simulate chemical reactions at all.

Just one impractical ion qubit alone can Not simulate chemical reactions or photosynthesis at a conical intersection, contrary to hypes.

(Fig.P)  Just one useless ion trapped in a Paul trap (= external electromagnetic field ) has nothing to do with simulating chemical reactions, photosynthesis or quantum computing, contrary to the overhyped fake news.  ↓

Various news sites spread overhyped news that researchers used a quantum computer to observe some chemical reaction called conical intersection related to photosynthesis.

But this is completely false.
Quantum computer is completely useless and unable to simulate any meaningful chemical reactions.

This research paper ( this ↓ ).

p.2-Figure 1-g says "in an ion-trap quantum simulator with a single ¹⁷¹Yb⁺ ion (= only one single ion qubit 0 or 1, Not a computer or simulator at all )."

p.7-Experimental setup says "The ¹⁷¹Yb⁺ ion (= only one ion ) is confined in a Paul trap.. We use two laser beams derived from a 355 nm pulsed laser to coherently control the qubit"

↑ This research just trapped only one single (impractical) ion with a controllable external electromagnetic field called a Paul trap, and manipulated the ion with two laser beams.

That's all. No quantum computing nor simulation of chemical reaction was done, much less photosynthesis (= just one single bit or ion can Not calculate nor simulate anything ).

Conical intersection is a very vague (useless) concept of two energy states intersecting ( this p.8 ).

Quantum computer cannot identify single nucleotides.

Nucleotides were distinguished by electric conductance measured by electrodes, Not by a quantum computer with only 3 impractical qubits.

The 4th, 6-7th paragraphs of this hyped news say

"the researchers used a quantum computer to distinguish adenosine from the other three nucleotide molecules (= false, Not a quantum computer but electrodes measuring electric conductance through nucleotides distinguished or identified the kinds of nucleotides )."

"The researchers used electrodes (= Not a quantum computer ) with a nanoscale gap between them to detect single nucleotides. The output of current versus time for the adenosine monophosphate nucleotide differed from that of the other three nucleotides"

"Variations in (electric) conductance depend on molecular rotation patterns that are unique for each nucleotide"

↑ This research tried to connect these electrodes to only 3 impractical (superconducting) qubits with high error rates.

As shown in this p.15-2nd-paragraph and p.16, the success rate of identifying nucleotides by these 3 impractical qubits (= connected to electrodes ) was only 77.8% (= error rate was 22.2% ), which was worse than the 95% success rate of the quantum simulator (= which is a classical computer ) connected to electrodes.

↑ Quantum computer was useless with No advantage.

Quantum computer is completely useless for health care.

Contrary to an incredible amount of overhyped fake news, today's quantum computers with small numbers of qubits (= only 5 bits ) and a lot of errors are far from practical in health care.

(Fig.H)  Today's quantum computer is useless, too error-prone with too small numbers of qubits to use for health care, contrary to overhyped news.

Today's quantum computers are useless for health care and drug discovery.

Contrary to an incredible amount of overhyped fake news, today's quantum computers are completely useless, too error-prone with too small numbers of qubits to use for health care or medicine.

So corporations and academia try to combine ordinary practical classical computers with still-impractical quantum computers as (deceptive) hybrid computers to make the hopeless quantum computers look promising in healthcare or drug development in hyped news.

Inferior useless quantum computers need practical classical computers as deceptive hybrid computers.

This 1st~2nd paragraphs say
"Although quantum processors exist today, they are still a long way off from becoming practical replacements for classical computers... IBM has now pitched the idea of a hybrid quantum-classical computer."

"IBM readily admits that nobody has yet demonstrated quantum advantage"  ← Quantum computer advantage is fake.

This-lower-Noise and error correction says
"To address this challenge the industry is exploring hybrid approaches of combining classical and quantum computing (= this is substantially a classical computer )"

This p.5-2nd-paragraph-lower says
"However, implementing quantum computing in drug discovery faces practical challenges... To address this, researchers are developing hybrid quantum-classical algorithms, such as the Variational Quantum Eigensolver (VQE,  this 3rd-paragraph )."

No advantage of quantum computer, which is still Not a computer.

This p.2-4th-paragraph says "Specialized quantum computing systems allow thousands of physical qubits, but this is still short of quantum advantage and does not make classical computing redundant"

This Impact of quantum computing on healthcare and biomedical research says
"However, it remains an open and crucial question as to whether QCs (= quantum computers ) can more efficiently solve problems that are intractable for classical computers"  ← No quantum computer advantage.

The 2nd-last paragraph of this overhyped news also admits
"However, we still have to wait a lot until any of this can be applied in real life, so this is rather science fiction at the moment"  ← quantum computer for healthcare is still science-fiction.

↑ Quantum computers useless for any applied science such as healthcare need overhyped fake news ↓

Examples of overhyped fake news trying to make hopeless quantum computers look promising in healthcare.

The research shown in the 2nd-paragraph (= Mathematics ) of this hyped news did Not use quantum computers, contrary to hypes.

This research paper mentioned in the upper news ↓

Abstract says "To execute the grid search, our research employed a classical optimiser (= this research used a classical computer, Not a quantum computer, contrary to hyped news )"

This p.5-3.2 (= hybrid quantum-classical algorithm ), p.7-Figure.2 and p.8-4.2 (= Experimental design ) say "The authors proceeded to train a VQC (= hybrid classical-quantum ) as a classifier within Qiskit (= classical computer simulator of qubits, Not a quantum computer ) machine learning principles."

↑ As a result, contrary to various hyped news, No quantum computers (= error-prone and having only small numbers of qubits ) were useful for molecular energy calculation nor health care.

They relied on classical computers for almost all calculations as the deceptive "hybrid" computers.

Quantum computer is completely useless for Alzheimer's diagnosis.

(Fig.A)  ↓ Quantum computer's diagnosis of Alzheimer is just hype and fake.

Quantum computers with small numbers of qubits cannot diagnose Alzheimer, contrary to hypes.

Insider Brief, the 4th-last~last paragraphs of this hyped news on the alleged quantum computing for Alzheimer's diagnosis say

"A hybrid classical-quantum (= which is substantially a classical computer, and the useless quantum computer could do almost nothing ) approach might (= just speculation, still useless ) boost the accuracy and efficiency of Alzheimer’s disease diagnosis."

"The experiments were conducted using a Hewlett Packard Core i5, sixth-generation computer with 8 GB RAM, and a Google Colab Pro GPU (= this research used ordinary classical computers for almost all calculations )."

"..On the quantum side, the researchers relied on a 5-qubit (= just 5 qubits or 5 bitstring 01101 is useless, unable to calculate anything, Not a quantum computer at all ) quantum hardware or simulator (= using classical computer for simulating imaginary quantum computer ), employing the QSVM model from the Qiskit library"

"However, the researchers acknowledge the need for further studies to evaluate the practical implementation of this model within medical devices (= still impractical )"

Only 5-qubits is useless, Not a quantum computer.

This research paper on the dubious quantum computer on Alzheimer ↓

p.3-Methods say "In this paper, we introduced a method based on ensemble learning and quantum machine learning classification algorithms (= quantum algorithms do Not mean the use of quantum computers ) that analyze MRI brain images and extract meaningful features for successful classification of AD stages"

p.8-Conclusion and recommendation say "We used 5 qubit quantum hardware (= just 5 qubits is useless, Not a computer ) or simulator (= classical computer simulator of imaginary quantum computer was also used ) and we utilized the QSVM model from the Qiskit library "

↑ This research used only a 5-qubit quantum computer (= each qubit takes 0 or 1, so 5 qubits can express only 2⁵ = 32 numbers, which can be handled much faster by an ordinary classical computer, so a quantum computer is completely unnecessary for this alleged Alzheimer diagnosis ).
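A trivial check (= my own illustration ) of how small a 5-qubit state space is for a classical computer: all 2⁵ = 32 basis bitstrings can be enumerated, and any function over them brute-forced, instantly.

```python
from itertools import product

states = ["".join(bits) for bits in product("01", repeat=5)]
print(len(states))     # 32
print(states[:4])      # ['00000', '00001', '00010', '00011']
```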

Actually, this research also conducted a classical-computer simulation of the imaginary 5-qubit quantum computer using the Qiskit QSVM algorithm (= quantum support vector machine,  this p.7,  this p.27-2nd-paragraph~ simulators ).

This classical computer simulator of the imaginary quantum computer is usually far better than the actual (useless) error-prone quantum computer ( this p.6-right-Table 4 shows the IBM QASM classical computer simulator was much faster than the IBMQ_16_Melbourne quantum computer ).

As a result, the quantum computer with only a small number of qubits (= only 5 ) is completely useless for Alzheimer diagnosis, contrary to the overhyped news.

Quantum AI for Alzheimer's disease early screening (← ? ) is just hype.

This recent overhyped research (2024) ↓

p.11-3.2 says "Due to limited accessibility to real quantum hardware, computation of kernels is simulated without noise on a classical hardware (= an ordinary classical computer was used in this research ), using tools from Qiskit library."

p.11-3.2 also mentions this research used only 6 ~ 12 qubits (= one qubit can take only 0 or 1 values, so just 12 qubits or 12 bitstring is still Not a computer ) simulated by classical computer (= Qiskit ), Not executed by today's (error-prone, useless) quantum computers.

There is No such thing as a quantum simulator.

Physicists just oscillated small numbers of atoms or ions disorderly with a lot of errors, and called them (fake) quantum simulators that are useless.

It is impossible for today's error-prone quantum computers with only small numbers of qubits to calculate or simulate some meaningful physical phenomena.

So all the dubious news about the alleged quantum simulators is just fake, untrue and useless.

 

All types of qubits are hopeless, impractical forever.

(Fig.Q)  All types of quantum computers are extremely error-prone, unstable and impossible to scale up.

Photon quantum computer is most error-prone, impractical forever

See this page.

Other qubits

Most major companies such as Google and IBM focus on another type of quantum computer using superconducting qubits (= just classical circuits ), which also suffer impractically-high error rates ( this 5th-paragraph,  this p.1-introduction-2nd-paragraph ).  The necessity to always cool their bulky qubits (= each qubit is as big as a millimeter !  this 3rd-last-paragraph ) to almost absolute zero makes it impossible to realize the (imaginary) practical quantum computers that will need millions of qubits ( this 4th-paragraph,  this 4th-paragraph ) forever.

D-Wave annealing machines for optimizing problems are Not true quantum computers nor faster than classical computers.

Topological quantum computer trying to use an illusory quasiparticle with fractional-charge called anyon or Majorana is unreal, fraudulent and impractical forever, Not achieving even a single qubit yet.

↑ The latest topological quantum computer is fake and unable to realize the true fault-tolerant quantum computer based on imaginary braids (= unseen non-existent braids ) at all ( this 9~10th paragraphs ).

Fluxonium qubits are useless.

MIT's new fluxonium qubits (= only 2 qubits ! ) are still error-prone, far from an (illusory) practical quantum computer that will need at least millions of qubits.

The current leading quantum computer companies such as IBM and Google use superconducting quantum bits or qubits (= which consist of just classical circuits ) called transmons ( this p.6-left-1st-paragraph,  this 14th-paragraph ), whose error rate is the lowest (= still high, though ) among all types of qubits.

↑ But even this "best" transmon qubit suffers extremely high error rates, whose two-qubit gate error rate is 0.5 ~ 1% or 0.005 ~ 0.01 (= fidelity is 99.5 ~ 99 % ), which error rate is much higher than the current ordinary practical classical computers whose error rate is only less than 10-17 ( this 2nd-paragraph,  this 17th-paragraph,  this last-paragraph ).

The recent (hyped) news claimed that MIT researchers created (only) two qubits of a new type of superconducting qubit called "fluxonium ( this Fig.1 )", whose two-qubit-gate error rate is 0.001 (= 99.9% accuracy or fidelity,  this 6th-paragraph ).  This is still far worse than a classical computer and the (illusory) practical quantum computer, whose error rate must be less than 10⁻¹⁵ ( this 5th-paragraph,  this 3rd-paragraph,  this 7th-paragraph ), and the practical quantum computer will need at least millions of qubits ( this 19-21th paragraphs ), which is impossible to realize forever.

↑ In fact, this fluxonium qubit studied by MIT is Not new.  However, No quantum computer companies have wanted to replace their conventional transmon qubits with this fluxonium qubit ( this 2nd-last~3rd-last paragraphs,  this p.1-right-2~3rd paragraphs ).

Because this allegedly-new type of fluxonium qubit is much more complicated ( this 5th-paragraph ) and much slower than the conventional transmon qubit, scaling up or increasing the number of these fluxonium qubits is impractical and impossible ( this 2nd-last-paragraph,  this-last~10th-last paragraphs,  this p.2-left-2nd-paragraph ).

↑ The time needed to operate a two-qubit (= CZ ) gate in fluxonium qubits is about 2~3 times longer (= 2~3 times slower ) than in conventional transmon qubits: the fluxonium two-qubit gate time is 85 ~ 200 ns ( this last paragraph,  this p.7-left-2nd-paragraph,Fig.4,  this p.1-right-lower ), versus 20~35 ns for transmons ( this p.2-left,  this Fig.1b,  this p.2 ).  ← No advantage, and the hopeless situation of quantum computers is unchanged.

Ion qubits are hopeless.

The ion-qubit quantum computer of Quantinuum is extremely slow and unable to be scaled up to a practical computer.  ← useless forever.

Trapped-ion quantum computers using unstably-floating ions (= two energy levels ) as qubits are known to be very slow, impractical, still error-prone and impossible to scale up.  = still fewer than 50 ion qubits ( this 11th-paragraph ).

The recent hyped news claimed that Quantinuum's H1 ( consisting of unstably-floating trapped ion qubits ) might successfully execute a fault-tolerant algorithm (= untrue ).

↑ Even in this latest Quantinuum ion-qubit computer consisting of fewer than 20 (ion) qubits (= falling far short of the million-qubit practical computer ), its error rate is extremely high (= each 2-qubit gate error rate is 2 × 10⁻³ or ~0.2 %,  this p.3 ), and the error rate even after the so-called "(deceptive) fault-tolerant" operation is still much higher (> 1.0 × 10⁻³,  this 6th-paragraph ) than the error rate required for a practical computer (= a practical computer's error rate must be less than 10⁻¹⁵,  this abstract,  this p.1-left-lower ).  ← still far from practical use.

These ion-qubit quantum computers adopted by Quantinuum and Honeywell can never be put to practical use, like all other hopeless quantum computers.  ← Physicists have already known this inconvenient fact, but have had No choice but to hide this disastrous truth under the current unrealistic dead-end mainstream physics.

First, it is impossible to scale up or increase the number of ion qubits to the practically-required million qubits, because precisely manipulating many unstably-floating ions (whose two energy levels are used as the qubit 0 and 1 states ) by many lasers is unrealistic ( this 12-13th-paragraphs,  this p.1-intro-1st-paragraph, this 2nd-paragraph,  this trapped-ion qubit section ).
Even the latest quantum computer consists of only less than 50 (ion) qubits ( this 4th-paragraph ).

Second, this ion-qubit quantum computer is more than 1000 times slower (= execution time is far longer ) than the current dominant superconducting qubits used by Google and IBM ( this p.4-right-3rd-paragraph ).

The gate time required for performing each two-qubit gate operation in ion qubits is longer than microseconds or μs (> 1000 ns,  this p.6,  this 2.2,  this p.2-left-1st-paragraph,  this p.5-right-lower ), which is far slower and longer than the gate time of superconducting qubits (= less than 50ns,  this p.2-left,  this Fig.1,  this p.11-Table.4 ).
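Quick arithmetic (= my own, from the gate times quoted above ): even per gate, ion qubits are more than an order of magnitude slower than superconducting qubits; the ">1000 times slower" figure above refers to overall execution time.

```python
ion_gate_ns, sc_gate_ns = 1_000, 50    # quoted: >1000 ns vs <50 ns per two-qubit gate
print(f"per-gate slowdown: >{ion_gate_ns // sc_gate_ns}x")   # -> >20x
```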

Furthermore, this latest Quantinuum H1 ion-qubit quantum computer could Not correct errors; instead, they just artificially "post-select" results to seemingly remove errors ( this p.7-left ), an ad-hoc method that is inapplicable to larger practical computers.

Even in the latest (exaggerated) research in 2023/11/29, they used only 51 ion qubits (= this 5th-paragraph ) with extremely high error rates: the error rate of preparing the initial state is as high as 25% (= fidelity is 0.75 ) and each two-qubit error rate is 2% (= fidelity is 0.98,  this p.8-appendix C,  this-middle two-qubit gate error ), which is far from the practical quantum computer requiring millions of qubits and an error rate of less than 10⁻¹⁵.

Also in the latest research in 2023/12, they could achieve only (impractical) 32 ion qubits with a lot of unsolved issues ( this 1st, 9-10th-paragraphs ).

↑ This latest 32-ion-qubit quantum computer's error rate is too bad (= 40% or fidelity is 0.6 ) in some random-number generating operations compared to the error-free practical classical computer ( this p.9-Fig.8 ).

As a result, the current hopeless situation of impossibility of practical quantum computer is unchanged (forever).

Quantum computer is deadend, still having only one impractical ion qubit (= Not a computer at all ).

The 1st, 4th, 10th, and last paragraphs of this hyped news say

"Researchers at ETH have managed to trap ions using static electric and magnetic fields and to perform quantum operations on them. In the future (= just speculation, still useless ), such traps could be used to realize quantum computers with far more quantum bits"

"In this way, it has been possible in recent years to build quantum computers with ion traps containing around 30 qubits (= far less than millions of qubits required for useful quantum computer ). Much larger quantum computers, however, cannot straightforwardly be realized with this technique (= deadend ). The oscillating fields make it difficult to combine several such traps on a single chip,"

" A single trapped ion (= only one ion qubit, which is Not a computer at all ), which can stay in the trap for several days, could now be moved arbitrarily on the chip, connecting points "as the crow flies" by controlling the different electrodes"

"As a next step, Home wants to trap two ions (= meaning this research just trapped only one ion qubit = Not a computer at all ) in neighboring Penning traps on the same chip.. This would (= just speculation ) be the definitive proof that quantum computers can be realized using ions in Penning traps (← false. Several ion qubits alone cannot prove quantum computers )"

Trapping only one ion (= only one bit ) cannot make a quantum computer that will need millions of qubits, and their research is already deadend with No progress.

↑ This research just trapped only one Be+ ion (= whose two energy levels were used as a bit or qubit's states 0 or 1 ), and moved it a little by controlling external electromagnetic field in Penning traps, and No quantum computer was realized, contrary to the overhyped headline "larger quantum computers".

Ordinary ion quantum computers (= still Not computers ) can Not hold or confine more than 32 ion qubits in one chip or one trap area; hence, they have to inconveniently move each ion to another Penning trap (= impractical, taking much time ) to realize interactions between more ions ( this-13-14th-paragraphs,  this p.1-left-1st-paragraph,  this p.1-left ).

↑ Even this latest research could trap only one ion (= just one quantum bit or qubit, 0 or 1 ) in a very large, bulky trap chip (= one qubit is as large as 100 μm ~ 1 mm, far bigger and bulkier than an ordinary classical computer's compact bit or transistor of only 40 nm ), and move it slightly (= about 50 μm ) and slowly (= each motion took 4 milliseconds ) at cryogenic temperature (= 6.5 K = impractical ), with No quantum computer calculation.

Practical quantum computer is said to need more than millions of qubits ( this-1st-paragraph,  this-2nd-paragraph,  this-3rd-paragraph ), so the current ion-trap quantum computer, which can Not have more than 30 qubits, is completely hopeless like all other quantum computers.

This research paper ↓

p.1-right-last-paragraph says "a single beryllium Be ion confined in electric trap (= only one impractical ion qubit )"
p.2-Fig.1 shows one ion qubit is very big and bulky (= more than 100 μm )
p.4-left-2nd-paragraph says "an ion was transported (= about 50 μm ) in 4 ms"
p.6-left-lower says "temperature of 6.5 K."

↑ Despite longtime research across the world, practical quantum computers that will need millions of qubits are still a long way off (= which means unrealized forever ), and quantum computer research is regressing to only one (impractical) qubit instead of progressing to larger numbers of qubits with fewer errors.

 

Spin qubits are deadend.

Spin-qubit silicon-type quantum computer is extremely impractical, unstable, impossible to scale up.

Silicon-type quantum computers, allegedly using a tiny electron-spin dot as a qubit, are far more impractical than other qubit types.  ← They have achieved only a highly-impractical 12 qubits, which falls far short of the practical quantum computer requiring millions of qubits ( this-lower Disagreement with Intel,  this 10th-paragraph ).

↑ This silicon-type quantum computer (= still Not a computer at all ) just measures the atomic energy state or magnetic field ( this Fig.4 ) interacting with light ( this Fig.4 ); they can Not directly measure the (fictional) electron's spin itself.

Even the latest published spin-qubit quantum computer's research used only six electron qubits (= correctly, only two qubits consisting of six quantum dots,  this 4th-last paragraphs,  this Fig.2 ), which is far from the practical quantum computer needing more than million qubits ( this 2nd-last-paragraph,  this 3rd-paragraph ).

↑ And their error rate of the two-qubit CNOT gate is impractically high = 3.7 % (= 3.7 × 10⁻² = 100% − 96.3% fidelity,  this-abstract, p.4-Fig.4,  this 3~5th-paragraphs ), which is also far from the practically-required error rate of 10⁻¹⁵ ( this 3rd-paragraph,  this last-paragraph ).

The latest single-electron charge qubit (= each electron's motion pattern is used as the quantum bit's state 0 or 1 ) also has an impractically high error rate (= the single qubit's readout error is about 2% = 100% − 98% fidelity,  this p.2-left ), and they did Not state two-qubit error rates because stable two-qubit gate operation is impossible.

Neutral atomic qubits are hopeless.

Cold atoms trapped in laser light or optical lattice are extremely unstable, easily lost and hard to manipulate, completely impractical.

Some groups such as Harvard try to exploit neutral atoms unstably trapped in laser light or optical lattice as quantum bits or qubits.

Each bit or qubit's 0 or 1 states are expressed by the neutral atomic two energy states or hyperfine states (= 0 or 1 depending on the atomic nuclear magnetic directions up or down with respect to electron's orbit ).

The problem of this atomic qubit is that physicists cannot control each atom precisely.

When they try to load atoms onto the designated optical lattice trap places, their success trapping rate is only 50% (= 50% of all the places remain empty,  this p.1-right,  this p.1-left-2nd-paragraph ).

And they cannot precisely control the places where atoms are loaded.  The survival rate is also bad ( this p.2-Fig.3B, p.3-Fig.4D ).

Fragile neutral atoms have to be kept at almost absolute zero temperature, which is impractical ( this-last-paragraph,  this p.6-left-1st-paragraph-10μK,  this p.7-1-2nd-paragraphs-1mK ).

Even at such an extremely low temperature, atoms frequently drop out of the laser-light optical lattice and disappear within 10 seconds ( this-middle-Components in detail,  this p.8- 4-5th-paragraphs ), so they cannot be used as stable memory bits.
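Two small estimates (= my own arithmetic, using the ~50% loading rate and ~10-second lifetime quoted in this section ) of why these atom arrays scale badly: a defect-free N-atom array is exponentially unlikely, and atoms vanish mid-computation.

```python
import math

p_load = 0.5                                  # quoted ~50% trapping success per site
for n in (10, 100, 1000):
    print(n, "sites all filled:", f"{p_load ** n:.3g}")
# 10 sites all filled: 0.000977
# 100 sites all filled: 7.89e-31
# 1000 sites all filled: 9.33e-302

tau = 10.0                                    # assumed ~10 s trap lifetime
print("fraction of atoms left after 60 s:", f"{math.exp(-60 / tau):.1g}")   # -> 0.002
```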

When reading each atomic bit state, physicists illuminate atoms with laser light with some wavelengths that can distinguish two hyperfine energy states, so that only one of two states is excited to the higher level, and emits the detectable fluorescence light.

↑ Even in this process of reading each atomic state, atoms hit by the probe light are often removed from the optical lattice ( this p.3-C. Atom Loss,  this p.9-right-2nd-paragraph ).

↑ There are a few cases of non-destructive detection methods, but they are limited to only less than 10 atoms or 10 qubits with high error rates ( this p.1-left, p.9-left ).

It takes an impractically long time (= milliseconds ~ seconds ) to reload new atoms into the optical trap, and each operation on an atom is longer and slower than with other qubits ( this 16th-paragraph,  this p.3-right-last-paragraph ).

In two-qubit operations, physicists have to excite each atom to a very high energy state with a big atomic radius, called a Rydberg state, which is extremely short-lived ( this p.13-3~4th-paragraphs ) and impractical with a high error rate or low fidelity ( this-6th-paragraph ).

Error correction of atomic qubits is also impossible.

 

Quantum computer's overhyped news

Atomic clock has nothing to do with ( error-prone ) quantum computers.

The 2nd, 7th, 8th, 9th paragraphs of this hyped news say

"Researchers recently explored the possibility (= just uncertain possibilities ) of using quantum computing techniques to further improve the performance of atomic clocks. Their paper, published in Nature Physics, introduces a new scheme that enables the simultaneous use of multiple atomic clocks to keep time with even greater precision (= lie, this research's error rate is far worse than ordinary atomic clock, and this research has nothing do to with quantum computers )."

"The researchers used their proposed technique to control individual atoms in atomic clocks (= unlike the practical atomic clocks treating many mixed atoms simultaneously, controlling individual atoms in this research caused impractically-worse errors ). Specifically, they ensured that each atom effectively experienced the passing of time slower or faster, depending on their dynamic position in relation to the applied laser beam."

"The current most precise clocks in the world work by measuring the passage of time with a large ensemble of atoms, but we demonstrate that individual control could lead to better performance (← untrue, controlling an individual atom or an unstable atomic qubit shows a far larger error rate ) More generally, our work shows the power of combining the features of quantum computers and quantum sensors (← untrue, this research has nothing to do with quantum computer nor sensors )"

"The initial findings.. are very encouraging, highlighting the potential (= just uncertain potential, still useless ) of quantum computing techniques in metrology research. In the future (= just speculation ), this study could inspire the development of other programmable quantum optical clocks"

↑ This research just used only two independent Sr atoms (= Not a quantum computer, which will need a lot of atoms or qubits ) interacting with laser light, and measured the atomic energy levels' change with an error rate far worse than an ordinary atomic clock's, so it is useless research.

In an ordinary atomic clock, the wavelength or frequency (= energy ) of the microwave is adjusted to agree with the interval between the atomic two energy levels by illuminating a collection of atoms with various microwaves.

↑ The error rate of manipulating an atom in this research (= average fidelity was 0.984, i.e. error rate was 0.016,  this p.3-left-last ) is far higher and worse than the ordinary atomic clock's error rate of only 10⁻¹⁷ ~ 10⁻¹⁸.

↑ So this research is useless not only for (error-prone) quantum computer but also for atomic clock.

Quantum computers must manipulate each atom, which causes extremely large error rate inapplicable to precise atomic clocks (= so this news claiming such error-prone quantum computers might make time more precise is a total lie ).

To use each single atom or ion for an atomic clock, they usually have to spend an extremely long time (= about 10 days ~ months,  this future plan,  this p.3-abstract ) at extremely low temperature to estimate the precise average atomic energy frequency or time, which is completely impractical.
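For reference, the frequency-steering scheme of an ordinary atomic clock described above can be sketched as a toy feedback loop (= my own simplified illustration, with a made-up linewidth ): probe both sides of the atomic resonance and step the microwave frequency toward the side that responds more strongly.

```python
f_atom, linewidth = 9_192_631_770.0, 100.0    # Cs clock transition (Hz); toy linewidth

def response(f):
    # Lorentzian transition probability around the atomic resonance
    return 1.0 / (1.0 + ((f - f_atom) / linewidth) ** 2)

f = f_atom + 400.0                            # local oscillator starts off-resonance
for _ in range(100):
    err = response(f + linewidth) - response(f - linewidth)   # two-sided probe
    f += 0.5 * linewidth * err                # steer toward the stronger side
print(round(f - f_atom, 2), "Hz from resonance")              # -> ~0.0
```

Controlling a large ensemble this way averages out single-atom noise; controlling atoms one by one, as in the quoted research, forfeits that averaging, which is one reason its error rate is so much worse.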

Actually, this research paper's p.1-abstract-last just says
"Our results demonstrate the potential (= just potential, still unrealized ) of fully programmable quantum optical clocks (= No mention of quantum computer nor practical atomic clock ) even without entanglement (= No entanglement means this research did No two-qubit operation required for computer's calculation ) and could be combined with metrologically useful entangled states in the future (= just speculate uncertain future )."

The world's largest 1000-atomic-qubit quantum computer is meaningless and useless.

Quantum computers are still error-prone and far from practical computer that will need millions of qubits.

The current world's largest quantum computer is said to be Atom Computing company's 1180 atomic qubits or IBM's 1121-superconducting qubits, but neither of which disclosed the detailed performance of the dubious machines ( this 1-3rd,12th-paragraphs  = D-Wave is Not a real quantum computer, so excluded ) due to their still-impractical error-prone quantum computers.

This 7~10th paragraphs say
"For quantum computers to dethrone today's best classical computers, however, scientists would still need a quantum processor with millions of qubits (= 1000 qubits can do nothing )."

"And that's a problem, because qubits are notoriously error-prone and need to be kept at near absolute zero (= energy-inefficient, impractical ).. One in 1,000,000,000,000,000,000 (billion billion) bits in conventional computers fails, while the failure rate in quantum computers is closer to 1 in 1,000 (= quantum computer is far more error-prone than ordinary classical computer )."

↑ So the current largest quantum computer (= still Not a computer nor calculator at all ) is far from the practical quantum computer that will need millions of qubits and far less error rates.

Quantum computers are said to calculate faster by exploiting ( still-unproven, fictional ) parallel-universe (= or superposition ) simultaneous computation.

But the recent research showed ordinary classical computer could calculate much faster and more accurately than the error-prone (impractical) quantum computer ( this 6th-paragraph ) even if classical computer was unfairly forced to emulate (still-unproven) quantum mechanical superposition or parallel worlds.

The latest largest 1305-atomic qubit quantum computer is still error-prone and useless.

The latest news claims researchers at TU Darmstadt have been the first to create the world's largest, 1305-atomic-qubit quantum computer, whose properties were published in a peer-reviewed journal (= actually, their largest quantum computer is still Not a computer, because they could calculate nothing using it in this research ).

The 2nd, 4th, 7th paragraphs of the hyped news about this alleged world's largest quantum computer say

"Quantum processors based on two-dimensional arrays of optical tweezers, which are created using focused laser beams, are one of the most promising technologies for developing quantum computing and simulation that will (= uncertain future ) enable highly beneficial applications in the future (= just speculation, still useless quantum computer )."

"the team reports on the world's first successful experiment to realize a quantum-processing architecture that contains more than 1,000 atomic qubits (= still far from millions of qubits required for practical computer ) in one single plane."

"A total of 1,305 single-atom qubits were loaded in a quantum array with 3,000 trap sites and reassembled into defect-free target structures with up to 441 qubits."  ← So actually, they created only 441 atomic qubits (= Not 1000 qubits ) that were placed in the "designated" positions (= even arranging atoms or qubits in desired positions is still impossible, so useless ).

↑ In the atomic-qubit quantum computer, they try to trap multiple cold neutral atoms (= each atomic two energy levels are used as a bit's 0 or 1 states ) in laser light or optical lattice tweezer.

↑ The most serious problem which makes atomic-qubit quantum computers impractical is that those atoms easily disappear and drop out of the (unstable) laser optical lattice trap ( this technical challenge ).

This research paper (= this p.3-left-results, p.4-left-1st-paragraph ) says
"an average loading efficiency of approximately 40% (= only 40% of atoms are trapped or loaded in laser light as qubits, and 60% atoms are lost )"

"With supercharging, the cumulative success probability increases to a value of 35% (= success rate of putting atoms onto right places is only 35%, and 65% error rate !  this-upper-figure-(a)-red line ), still not exhibiting saturation at the maximum implemented number of 50 assembly cycles"

↑ This research on the alleged world's largest atomic-qubit quantum computer (= still impractical and error-prone ) did Not perform any calculations or qubit operations.

According to the recent unpublished Atom Computing paper, they had to continuously load new atoms onto the optical lattice at time intervals of 300 ms (= the frequently-lost atomic qubits cause errors and cannot be used as practical memories ), because atoms or qubits constantly disappear with a lifetime of less than only 30 seconds ( this p.6-right-Appendix B,  this p.9-2nd-paragraph ).

↑ In spite of this frequent loading of new atoms, still 1% of atoms remained lost among only 1225 atomic qubits (= far from the practical millions of qubits, this 1st, 4th-paragraphs.  ← Atom Computing's paper does not show the success probability or error rates of loading atoms ).

↑ Atom Computing's quantum computer also did Not calculate anything nor operate qubits due to focusing all of their time and energy only on atomic qubit loss ( this-abstract ).

As a result, quantum computers are already dead and hopeless with No progress, contrary to an incredible amount of overhyped news.

Hyped topological quantum computer with fictional anyon quasiparticle in 2024.

Research on a topological quantum computer with fictional non-Abelian anyon quasiparticles, which was allegedly less error-prone (= wrong, actually very error-prone ) and appeared as an arXiv preprint a year ago, was recently published in Nature (= 2/14/2024 ).

First of all, this topological quantum computer is just fiction and scientifically meaningless except for publishing papers in academic journals.

In a topological quantum computer, fictional quasiparticles such as anyons (= Not real particles, this 3rd-paragraph ) and Majoranas with fractional charge are said to entangle with each other using fictional unseen braids (= Not physical strings, this p.3-middle-2nd-paragraph ), and these nonphysical tied braids of anyon quasiparticles are said to make the topological quantum computer robust or less error-prone (= only inside unproven imaginary mathematical theory ).

There is No evidence that such (fictional) anyon quasiparticles or magical braids (= Not real particle or braid, but just nonphysical math thing, this 10th-paragraph ) may exist.

This recent alleged non-Abelian topological quantum computer published in Nature just used 27 trapped ion qubits (= still far from millions of qubits required for practical quantum computer ), which are Not the authentic (fictional) anyon quasiparticles, and they are still error-prone (= due to No real magical braids ) and useless.

This 9th-paragraph says
"However, some researchers claim that Quantinuum has Not actually created non-Abelian anyons. They argue that the firm has instead merely simulated them ( this 2nd-last-paragraph )."  ← They tried to pretend that 27 ions ( this p.2-left-2nd-paragraph, p.4 ) were fictional anyon quasiparticles.

This last-paragraph says
"All in all, there are still questions on whether the "credible path to fault tolerant quantum computing" was unlocked ( this 10-13th-paragraphs )"  ← still No evidence of less errors

This 27-ion-qubit quantum computer (= still Not a computer ) pretending to be (fictional) braided anyon quasiparticles ( this p.27-30 ) still has a very bad and high error rate of 35% (= 1.6% error rate per single qubit ), or a fidelity (= 1 - error rate ) of 0.65 (= 0.984 per qubit, this p.12-A14,A15 ), which is far worse and higher than the 10⁻¹⁵ error rate needed for a useful quantum computer.
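The compounding arithmetic quoted above checks out (= my own calculation, assuming independent per-qubit errors ): a 0.984 per-qubit fidelity across 27 qubits multiplies out to roughly the reported 0.65 total fidelity, i.e. ~35% error.

```python
print(round(0.984 ** 27, 2))   # -> 0.65
```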

Quantum computer is deadend, regressing to only one useless qubit, which is far from the practical millions of qubits.

The 3rd, 4th, 13th, 14th paragraphs of this hyped news say

"In a paper published in Nature Communications, the engineers describe how they used the 16 quantum 'states' of an antimony (= Sb ) atom to encode quantum information (= actually only two energy levels out of 16 splitting energy states of one Sb atom could be used as only one qubit's 0 and 1 in this research, this p.7-Gate fidelities,  which is still Not a computer )."

"Antimony is a heavy atom that can be implanted in a silicon chip, replacing one of the existing silicon atoms. It was chosen because its nucleus possesses eight distinct quantum states, plus an electron with two quantum states, resulting in a total of 8 x 2 = 16 quantum states, all within just one atom (= only one qubit ). Reaching the same number of states using simple quantum bits—or qubits (= this experiment used only one Sb atom or one qubit, which is Not a quantum computer that will need millions of qubits )"

"The quantum computers of the future will (= just speculation, still useless ) have millions, if not billions of qubits (= but this experiment used just one impractical atomic qubit ) working simultaneously to crunch numbers and simulate models in minutes that would (= just baseless speculation ) take today's supercomputers hundreds or even thousands of years to complete (= which overhyped description contradicts the latest result proving classical computer outperforming quantum computer )"

"While some teams around the world have made progress with large numbers of qubits, such as Google's 70 qubit model or IBM's version which has more than 1000, they require much larger spaces for their qubits to work without interfering with one another"

↑ The current leading superconducting qubits are inconveniently big (= one qubit is about a millimeter across ), bulky and energy-inefficient (= they have to be cooled to almost absolute zero ), so they cannot be scaled up to the millions of qubits required for the practical quantum computer ( this 3~4th-paragraphs ).
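To see why millimeter-scale qubits cannot scale, here is a back-of-the-envelope Python sketch (the ~1 mm² footprint per qubit is an assumed round number based on the "about a millimeter" figure above; wiring and cooling overhead are ignored):

# Back-of-the-envelope: chip area for a million millimeter-scale qubits.
# The 1 mm^2 footprint per qubit is an assumption, not a measured value.
footprint_mm2 = 1.0
n_qubits = 1_000_000
total_area_m2 = footprint_mm2 * n_qubits / 1_000_000  # 10^6 mm^2 = 1 m^2
print(f"qubit area alone ≈ {total_area_m2:.0f} m^2")  # ≈ 1 m^2, far beyond any practical cryogenic chip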

↑ This research tried to use only one single Sb atom as one qubit (= using two energy levels of a Sb atom as the bit's states 0 and 1,  this p.7-left-gate-fidelities-only one-qubit gate ) with a still-high error rate.  ← Using only one bit or qubit (= even a two-qubit operation has Not been done in this research ) is Not a computer at all.
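For perspective, even if all 16 states of the Sb atom were usable, they would span only log2(16) = 4 qubits' worth of state space, as this small Python check shows:

import math
# 8 nuclear states x 2 electron states = 16 states in one Sb atom,
# equivalent to only log2(16) = 4 qubits -- and this experiment used
# just two of those levels as a single qubit.
n_states = 8 * 2
print(f"{n_states} states = {math.log2(n_states):.0f} qubits' worth of state space")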

Still only one qubit, a high error rate, Not a computer at all.

The error rate of a one-qubit operation in this research is still impractically high = one qubit's error rate is 1 ~ 10% ( this p.17-Table S2,S3 ), which is far higher and worse than the practically-required error rate of less than 10⁻¹⁵.
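A short Python sketch (assuming independent gate errors, the standard simplification) shows the gulf between a ~1% per-gate error rate and the 10⁻¹⁵ level cited above as practically required:

# Probability that a circuit of n gates finishes with no error,
# assuming each gate fails independently with probability p.
def success_probability(p, n_gates):
    return (1.0 - p) ** n_gates

print(f"{success_probability(0.01, 1_000):.1e}")    # 1% error, 1,000 gates: ≈ 4.3e-05
print(f"{success_probability(1e-15, 10**12):.4f}")  # 1e-15 error, a trillion gates: ≈ 0.9990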

To explain the observed multiple energy levels of a Sb atom under an electromagnetic field, they just fitted free parameters to the experiments ( this p.11-S11, p.12-S15 ) with No quantum mechanical prediction or calculation.

The practical quantum computer is said to need more than millions of qubits ( this 7~8th-paragraphs ).
But even this latest research in 2024 could make only one unstable (atomic) qubit with an impractically-high error rate.

So quantum computer research is already deadend with No progress; rather, it is regressing to only one single useless qubit (= Sadly, publishing papers in journals instead of aiming to invent useful machines is the current research's only purpose ).

 

Quantum network, internet is a hoax

(Fig.C)  Quantum cryptography uses very weak classical light as the fictitious photon's information carrier, which is easily lost, so it is impractical forever.

See Quantum network is hoax.

Quantum information, internet hype

Quantum key distribution is impractical

The 3rd paragraph of this hyped news says
"She investigated how the visibility of the so-called Hong-Ou-Mandel effect, a quantum interference effect (= wrong. This can be explained by classical light interference + a photodetector that just detects electrons ejected by classical light, Not fictitious photon particles ), is affected by multiphoton contamination."

And the 2nd-last and last paragraphs of the same news say
"The findings could (= just speculation ) be important for quantum key distribution, which is necessary for secure communications in the future (= uncertain future )
However, many questions remain unanswered. Little research has been done into multiphoton effects, so a lot of work is still needed (= still useless quantum key distribution )"

Quantum memory is impractical forever.

Physicists try to develop a "quantum memory" (= usually some atomic energy levels are used for storing light energy or information ) that can allegedly store the information of light or photons transiently.

But even in the latest research in 2023, the storage efficiency of this quantum memory is only 60% (= 40% of the light or photon information is lost each time they try to store information in this impractical quantum memory ), which is deadend, unable to make a useful quantum internet.
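A minimal Python sketch shows why a 60% efficiency does not compose (assuming a chain of identical, independent memories, as a repeater network would need; this is a simplification):

# Each storage step keeps only 60% of the signal; a chain of memories
# multiplies the losses together.
efficiency = 0.60
for n in (1, 2, 5, 10):
    print(f"after {n:2d} storage steps: {efficiency ** n:.1%} of the light left")
# after 10 steps, only ~0.6% of the signal survives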

But the 2nd and last paragraphs of this hyped news say
"So far, developing these high-dimensional memories has proved challenging, and most attempts have not yielded satisfactory efficiencies"

"In the future, we will (= uncertain future, still useless ) establish high-dimensional quantum repeaters using high-dimensional quantum memories, enabling high-dimensional quantum communication between two or more remote quantum nodes (← ? )"

Quantum internet, network is a joke, impractical forever, contrary to the media-hype.

The 10th, 13th, 14th paragraphs of this recent hyped news say

"A spinning electron (= wrong, an electron's spin is Not real spinning ), or electronic qubit, is very good at interacting with the environment (= wrong, there is still No practical qubit ), while the spinning nucleus of an atom, or nuclear qubit, is not. We've combined a qubit that is well known for interacting easily with light with a qubit that is well known for being very isolated, and thus preserving information for a long time (= this is wrong, too )"

"In general, then, light carries information through an optical fiber to the new device, which includes a stack of several tiny diamond waveguides that are each about 1,000 time smaller than a human hair. Several devices, then, could act as the nodes that control the flow of information in the quantum internet (= wrong, still far from internet )."

"The work described in Nature Photonics involves experiments with (only) One (tiny) device (= Not internet nor network connecting multiple devices at all ). Eventually, however, there could (= just speculation, still useless ) be hundreds or thousands of these on a microchip (= just imagination )"

↑ This research tried to use two atomic energy states or levels (= Not spin itself ) of a tin atom inside a tiny diamond as (quantum) bits 0 and 1 (= the fictitious electron spin itself is unseen; they just measured two energy states interacting with light. No quantum mechanical prediction nor calculation using spin was done in this research, just experimental energy values were used, this p.3-left ).

Realization of a practical quantum internet or network is impossible forever due to the severe loss of the light or photons whose polarization or phase is used as bit information.

Even this latest research (= p.1-left-last~right ) admits
"However, spectral instability and low-rate coherent-photon emission remains an obstacle for scaling up quantum networks (= still impractical )."

First of all, the tiny device allegedly used as qubits in this research must be cooled to an extremely low temperature = only 0.4 K (= almost absolute zero ! ), which is impractical ( this p.7-left-2nd-paragraph ).

They reported that about 98% of the light or photons entering this tiny device were lost, which means the light or photon detection efficiency (after interacting with this tiny device), even in this latest research, was only 1.7% (= more than a 98% error rate ), as shown in this p.5-6-Table S1.

The probability of detecting two photons (= two weak lights ) is miserably low = only 5.0 × 10⁻⁵ = 0.00005 (= due to severe photon loss ), as shown in this p.4-left-2nd-paragraph and this Fig.3c.
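These two numbers are roughly consistent under a simple independent-loss model, sketched in Python below (an upper-bound estimate only; the remaining gap down to 5.0 × 10⁻⁵ would come from additional losses this toy model ignores):

# If each photon independently survives with probability ~0.017, the
# chance of detecting both photons of a pair is at most the product.
eta = 0.017
print(f"two-photon detection <= {eta ** 2:.1e}")  # ≈ 2.9e-04
# The reported 5.0e-05 is lower still, i.e. even worse than this
# simple independent-loss upper bound.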

As a result, quantum internet, quantum network and quantum computers trying in vain to use fragile photons or very weak light as information messengers are impractical forever, because photon detection efficiency and severe photon loss have Not been improved at all despite extremely long research across the world.

Quantum internet is still a joke, impractical forever.

The 2nd, 4th, 7th paragraphs of this hyped news say

"A team.. have taken a significant step toward (= meaning "still unrealized" ) the building of a quantum internet testbed by demonstrating a foundational quantum network measurement that employs room-temperature quantum memories. "

"While the vision of a quantum internet system is growing and the field has seen a surge in interest from researchers and the public at large, accompanied by a steep increase in the capital invested, an actual quantum internet prototype has Not been built (= even the prototype of quantum internet has Not been built )"

"They tested how identical these memories (= this quantum memory just means Rb atomic two energy levels' transition storing the incident weak light or photon energy, this p.6 quantum memories, p.2-results ) are in their functionality by sending identical quantum states (= just using weak classical polarized light or photon as quantum information carrier ) into each of the memories and performing a process called Hong-Ou-Mandel (= HOM ) Interference (= which is just interference of two classical lights after passing beam splitters ) on the outputs from the memories, a standard test to quantify the indistinguishability of photon properties."

↑ This research just sent two very weak (classical) light pulses called (fictitious) photons through rubidium (= Rb ) atomic vapor (= each Rb atom's energy levels can store the incident light's energy transiently = only microseconds ) called a quantum memory, and got two retrieved lights (= with significant photon loss, so a useless quantum memory ) interfering with each other at beam splitters.

This p.3-right-4th-paragraph says
"a mean photon number per pulse of ~1.6 serve as the inputs to the memories in a single-rail configuration. After storage and retrieval from each memory, we measure the mean signal photon number to be ~0.01 (= photon number drastically decreases from 1.6 to only 0.01 after stored in the impractical quantum memory ) at the inputs of the beam splitter"

According to this p.3-Table 1, the input photons (= which are just weak classical light whose intensity exceeds the photodetector's threshold ) suffered significant loss while traveling and getting through the quantum memory, and the photon number drastically decreased from 1.6 to only 0.00065 (= the photon detection efficiency is only 0.00065/1.6 = 0.0004, meaning 99.96% of photons were lost ), so this quantum memory or internet is completely impractical.
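The loss arithmetic quoted above can be reproduced in a few lines of Python (the numbers are taken directly from the quoted passages):

# Photon budget from the paper, as quoted above (mean photons per pulse).
n_in = 1.6            # entering the memory
n_retrieved = 0.01    # at the beam splitter inputs, after storage and retrieval
n_detected = 0.00065  # finally detected (Table 1)
print(f"memory retrieval efficiency ≈ {n_retrieved / n_in:.1%}")  # ≈ 0.6%
print(f"end-to-end efficiency ≈ {n_detected / n_in:.2%}")         # ≈ 0.04%
print(f"overall photon loss ≈ {1 - n_detected / n_in:.2%}")       # ≈ 99.96%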

This paper p.6-left-3rd-paragraph also says
"several hurdles still exist toward using warm atomic vapor QMs (= quantum memories ) in memory-assisted measurements and operations. As evidenced in this study, mitigating the noise associated with these memories stands as a paramount challenge to achieve indistinguishable retrieved qubits."

As a result, the quantum internet or network is already deadend and impractical (forever) despite the incredible amount of time and money wasted on this (fruitless) research across the world.
Only unfounded hypes misleading laypersons remain.

 


2023/11/21 updated. Feel free to link to this site.