IBM's world's-largest 1121-qubit Condor quantum computer is fake, a publicity stunt.


The media and academia hide the inconvenient fact that the world's largest IBM quantum computer is fake, an illusion.

(Fig.1)  IBM's largest Osprey (433 qubits) and Condor (1121 qubits) are illusions, just publicity stunts with No utility.

No evidence of quantum computers' speed-up based on parallel-universe simultaneous calculations.

Overhyped quantum computers are useless, a dead end, able to factor only 21 = 3 × 7 by using a fake, slower version of Shor's algorithm (= No progress for decades ).

The overhyped quantum computers are said to factor numbers into primes faster (= by Shor's algorithm ) by using (fictional) quantum superposition or parallel-universe simultaneous calculations ( this-last-paragraph,  this-last-paragraph ).

But so far, the largest number factored by quantum computers (= still Not computers ) is just 21 = 3 × 7 (= No progress for decades ), by a fake, slower version of Shor's algorithm that recycles qubits instead of using the ( fictional, faster ) parallel-universe simultaneous calculations, which have No evidence ( this-9. ). A classical sketch of the order-finding step at the heart of Shor's algorithm is given below.

Methods other than Shor's algorithm are optimization or quantum annealing (= minimization: finding the lowest-energy state ), which is useless, just a publicity stunt, too error-prone to give right answers. A toy annealing sketch also follows below.

↑ So they can factor only numbers whose answers are already known beforehand or that require only a few qubits ( this-p.1-right-lower or p.4 used only 2 or 4 qubits, which cannot factor real numbers whose answers are unknown; one qubit can take only a 0 or 1 value, so this is still Not a quantum computer ).
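For reference, here is a minimal classical sketch of the order-finding step at the heart of Shor's algorithm, applied to N = 21. The period search below is done by brute force; it is exactly this step that a quantum computer is claimed (with No evidence) to speed up. The function name and the choice a = 2 are illustrative, not taken from any cited paper.

```python
from math import gcd

def shor_factor_classical(N: int, a: int):
    """Order-finding core of Shor's algorithm, with the quantum
    period search replaced by classical brute force.
    Assumes gcd(a, N) == 1."""
    # find the order r: the smallest r > 0 with a**r % N == 1
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None              # odd order: retry with another a
    y = pow(a, r // 2, N)        # a^(r/2) mod N
    if y == N - 1:
        return None              # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

# 21 = 3 x 7: still the largest number factored this way
print(shor_factor_classical(21, 2))   # -> (7, 3)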
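And for the quantum-annealing approach: the "minimization finding the lowest energy state" can be illustrated with a classical simulated-annealing toy on a tiny Ising energy function. The couplings J below are made-up numbers for illustration only, not from any annealer.

```python
import math, random

# Toy Ising energy E(s) = -sum_ij J[i,j] * s[i] * s[j], with s[i] in {-1,+1}.
# Annealing (quantum or simulated) tries to find the lowest-energy spins.
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.7}      # made-up couplings

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def simulated_anneal(n=3, steps=2000, T0=2.0):
    s = [random.choice((-1, 1)) for _ in range(n)]
    E = energy(s)
    for step in range(steps):
        T = T0 * (1 - step / steps) + 1e-9        # linear cooling schedule
        i = random.randrange(n)
        s[i] = -s[i]                              # propose one spin flip
        E_new = energy(s)
        # keep downhill moves; keep uphill moves with prob exp(-dE/T)
        if E_new <= E or random.random() < math.exp(-(E_new - E) / T):
            E = E_new
        else:
            s[i] = -s[i]                          # reject: undo the flip
    return s, E

print(simulated_anneal())   # e.g. ([1, 1, 1], -1.2), the minimum energy
```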

Google's optimistic estimate of millions of qubits needed for cracking RSA is wrong, because today's quantum computers are already a dead end, with No progress.

Today's quantum computers with small numbers of qubits (= about 1000 or fewer ) are far from the millions of qubits required for a practical quantum computer, which is impossible to realize forever.

Recently, Google estimated that about one million qubits (= one qubit can take a 0 or 1 value ) will be necessary to crack the current RSA encryption, revising its former estimate of 20 million qubits.

This Google paper's p.21-4 conclusion says
"I reduced the expected number (= just speculation ) of qubits needed to break RSA2048 from 20 million to 1 million (= still on the order of a million qubits, which is impossible to realize forever ). "

This-lower-narrowing gap (5/24/2025) says
"While the estimated hardware still doesn't exist,...
the threat remains hypothetical. The hardware to execute such a factoring attack is Not yet available."  ← still a pipe dream, contrary to an incredible amount of overhyped news

↑ This Google estimate is too optimistic and unrealistic, like parallel universes.

A practical quantum computer needs more than millions of qubits, which is a pipe dream forever.
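For scale, here is a rough back-of-envelope of why such estimates land in the millions of physical qubits. It uses the standard surface-code overhead of roughly 2·d² physical qubits per logical qubit; the logical-qubit count and code distance below are ballpark illustrative assumptions, not Google's exact parameters.

```python
# Back-of-envelope: why RSA-2048 estimates reach millions of physical qubits.
# Assumptions (illustrative, not Google's exact parameters):
#   - factoring a 2048-bit RSA modulus needs a few thousand LOGICAL qubits
#   - surface-code error correction needs roughly 2 * d**2 PHYSICAL qubits
#     per logical qubit, where d is the code distance
logical_qubits = 3000        # ballpark for 2048-bit factoring circuits
code_distance  = 15          # ballpark distance for a low enough logical error
physical_per_logical = 2 * code_distance ** 2
total_physical = logical_qubits * physical_per_logical
print(f"{physical_per_logical} physical qubits per logical qubit")
print(f"~{total_physical:,} physical qubits total")   # ~1,350,000
```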

Today's largest quantum computer is said to be IBM's 1121-qubit Condor machine (= excluding D-Wave's fake quantum computers ).

This-4~5th-paragraphs say "Note: Another company, D-Wave Systems, has previously announced quantum computers featuring thousands of qubits. However, these rely on a technique called quantum annealing with high error rates and are generally Not accepted by researchers as true "universal" quantum computers."

So today's quantum computers, with only small numbers of error-prone qubits, fall far short of the millions of qubits (= quantum bits ) required for a practical quantum computer ( this-4th-paragraph,  this-summary ), which can never be realized.

The media hides quantum computer frauds and the inconvenient fact that IBM's largest 1121-qubit quantum computer is fake, just a publicity stunt.

The media and academia are 'silent' about D-Wave's and Microsoft's obvious quantum computer frauds.

The serious problem is that almost all the media and academia hide the inconvenient truth that IBM's largest 433-qubit Osprey and 1121-qubit Condor are fake, useless quantum computers, just publicity stunts.

↑ Likewise, the current media and academia (= including Scott ) stay silent about and hide various already-obvious quantum computer frauds, such as D-Wave's ( this-p.1,p.3 ) and Microsoft's fictional Majorana quasiparticle qubit.

Because debunking the D-Wave and Microsoft Majorana quantum frauds would mean denying the current quantum mechanical methods such as quantum annealing optimization, (unphysical) topological theory, and imaginary ( fractional-charge Majorana, anyon ) quasiparticle models, which involve many universities and corporations with large political power to manipulate the world's media and suppress the inconvenient truth of the present, already-deadend physics.

IBM's 433-qubit Osprey quantum computer is fake, useless, just a publicity stunt.

In November 2022, IBM unveiled the Osprey 433-qubit quantum computer.

But IBM has stubbornly hidden the detailed properties and error rate of this Osprey (= even now, data on the elusive 433-qubit Osprey and 1121-qubit Condor quantum computers are unknown, as shown in this-p.3-Table I, where the QV or quantum volume data is unknown ).

This-13th paragraph says
"IBM has Not yet published any qubit fidelities or quantum volume data on its new Osprey processor. Either, they have Not yet done their benchmark, or the data is bad."

This-Updates-5,6 say
"5. On May 8, 2023 — Osprey is now publicly available."

"6. September 7, 2023 — BREAKING NEWS — Osprey is now retired.... No reason given"

↑ So the IBM Osprey quantum computer, which got a lot of media coverage, turned out to be useless, just a publicity stunt, secretly retired in just 4 months like a 'ghost'.

IBM's 433-qubit Osprey's error rate was much worse than that of the old 127-qubit Eagle, so scaling up quantum computers is impossible; it just increases uncontrollable errors.

There was only one research paper on this ghost-like IBM Osprey quantum computer, this paper ↓

p.12-left-first says "a maximally entangled Bell state (= two qubits manipulated so that when qubit-1 is 0, qubit-2 is also 0, and when qubit-1 is 1, qubit-2 is 1 ) has a negativity of N = 0.5, whereas a fully separable (= non-entangled or random ) state has a negativity of N = 0 ( this-p.3-left-last )."

p.15-right-last-paragraph says "a larger 414-qubit graph state prepared on the 433-qubit device ibm seattle (= Osprey ),... The average qubit pair negativity is found to be 0.115 (= far lower and worse than the error-free negativity of 0.5 )."

p.16-Table IV's mean N (= negativity ) of seattle (= the 433-qubit Osprey ) is always lower and worse (= Osprey's error rate was higher ) than that of the other, older IBM quantum computers with smaller numbers of qubits (= 27~127 qubits ).

↑ So IBM's 433-qubit Osprey's error rate was much worse than that of the older, smaller quantum computers with just 27 ~ 127 qubits.

↑ Google's rough estimate of millions of qubits required for cracking the RSA key was too optimistic, inconsistent with the fact that scaling up quantum computers is unrealistic, just increasing error rates.
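For reference, the negativity N quoted above is a standard textbook quantity: take the two-qubit density matrix, partially transpose one qubit, and sum the magnitudes of the negative eigenvalues. A minimal sketch (Not code from the paper) reproducing N = 0.5 for a Bell state and N = 0 for a separable state:

```python
import numpy as np

def negativity(rho):
    """Entanglement negativity of a two-qubit density matrix:
    the sum of |negative eigenvalues| of its partial transpose."""
    # reshape to (2,2,2,2) and transpose the second qubit's indices
    rho_pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    eigs = np.linalg.eigvalsh(rho_pt)
    return float(np.sum(np.abs(eigs[eigs < 0])))

# maximally entangled Bell state (|00> + |11>) / sqrt(2)
bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
print(negativity(np.outer(bell, bell)))        # -> 0.5

# fully separable (maximally mixed / random) state
print(negativity(np.eye(4) / 4))               # -> 0.0
```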

Even IBM does Not use the ghost-like, useless 433-qubit Osprey or 1121-qubit Condor quantum computers.

Even IBM researchers' latest research ( 5/22/2025 ) used only 52 qubits ( out of their older 127-qubit Eagle called ibm-nazca ) instead of the larger, latest 433-qubit Osprey or 1121-qubit Condor, with the help of a classical computer ( this-p.3-Figure 2, p.5-left,  this-lower ).

Other recent IBM research also used the older 127-qubit Eagle ( this-4th-paragraph, 2/12/2025, or  this-p.2-Fig.1 ).

I could Not find a single IBM research paper on the 433-qubit Osprey or 1121-qubit Condor ghost quantum computers, which were fake, just a publicity stunt to make today's deadend quantum computers appear to be progressing.

The world's largest IBM 1121-qubit Condor quantum computer is a ghost, an illusion, just a publicity stunt.

The 2nd~4th, 7th, 9th paragraphs of this news say

"The 1,121-qubit IBM Quantum Condor chip is built on the architecture of its previous flagship, the 127-qubit Eagle chip. In size, it is just shy of the record holder, a 1,125 (= 1,225 )-qubit machine unveiled by the company Atom Computing"

"IBM said it won't put any of these Condor processors into its "next-generation" System Two quantum computer"

"Instead, it plans to use the newly launched IBM Quantum Heron processors, which have just 133 qubits each."  ← IBM slighted 1121-qubit Condor.

"For quantum computers to dethrone today's best classical computers, however, scientists would still need a quantum processor with millions of qubits (= today's largest 1000-qubits are still far from millions of qubits required for a useful quantum computer )."

"That's why IBM is far more excited by its Heron chip — its error rate is five times lower than that of the behemoth Condor."

↑ So IBM's largest 1121-qubit Condor's error rate is 5 times worse than that of its much smaller 133-qubit (or 156-qubit ) Heron quantum computer ( this-lower-Key Specifications-table-error rate ).

This-last-paragraph says
"In fact, the original Heron demonstrated a 5× lower error rate than the much bigger Condor prototype (= IBM's largest 1121-qubit Condor is impractical, too error-prone. Scaling up quantum computers is impossible )"

Scaling up unstable neutral atomic qubits is impossible.

Atom Computing and QuEra try to use the energy states of each unstable neutral atom trapped in laser light as a qubit's 0 or 1; such atomic qubits are useless, easily lost ( this-p.1-left-lower ).

So scaling up these extremely unstable neutral atomic qubits without increasing errors is impossible forever ( this-last-wrapping up ), like all other qubits.

This-8th-paragraph says
"Indeed, Atom Computing's new computer is bigger, but not necessarily more powerful. The whole thing can't yet be used to run a single calculation, for example, due to the accumulation of errors as the qubit count rises."

This-3rd paragraph (4/25/2025) says
"Atoms have finite lifetimes, and processes involved in preparing the register can fail, making it increasingly difficult to scale up. The risk of losing atoms grows exponentially with the number of atoms"

↑ This latest Pasqal atomic-qubit count of 507 qubits in 2025 is lower and worse than Atom Computing's 1180 qubits in 2023.

↑ The numbers of unstable neutral atomic qubits are getting worse, regressing, so scaling up atomic qubits is also impossible.
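The quoted exponential loss risk is easy to illustrate: if each atom independently survives register preparation with probability p, the chance of keeping all n atoms at once is p^n, which collapses as n grows. The 99% survival figure below is a made-up illustrative number, not a measured one.

```python
# If each trapped atom survives with probability p, keeping ALL n atoms
# at once has probability p**n, which dies off exponentially in n.
p = 0.99                      # illustrative per-atom survival probability
for n in (100, 500, 1180):    # 1180 = Atom Computing's 2023 qubit count
    print(f"n = {n:4d}: P(all atoms kept) = {p**n:.2e}")
# n =  100: P(all atoms kept) = 3.66e-01
# n =  500: P(all atoms kept) = 6.57e-03
# n = 1180: P(all atoms kept) = 7.07e-06
```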

Even IBM's 'true' largest quantum computer, the 156-qubit Heron, is hopeless, too error-prone.

IBM clearly ignores the useless, error-prone, largest 1121-qubit Condor quantum computer, and focuses on the much smaller 156-qubit Heron.

But even in the latest research on this IBM hope, the Heron quantum computer, in March 2025, they could use only 15 error-prone qubits ( out of the 156-qubit Heron r2 quantum computer called ibm-fez, this-p.2-right-2nd-paragraph-last ), whose error rate was terrible, too high = 50% (= the fidelity of 15 qubits flipped just 20 times or layers was 50%, i.e. a 50% error rate, this-p.7-Fig.6 ), which is completely useless, unable to give right answers at all.
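To put that 50% figure in perspective, assuming a simple multiplicative decay model (an illustrative assumption, not the paper's own analysis): a total fidelity of 0.5 after 20 layers implies a per-layer fidelity of 0.5^(1/20) ≈ 0.966, i.e. roughly 3.4% of error added per layer.

```python
# Simple multiplicative decay model: F_total = f_layer ** layers.
# With F_total = 0.5 after 20 layers (the quoted Heron result):
layers, F_total = 20, 0.5
f_layer = F_total ** (1 / layers)
print(f"per-layer fidelity ~= {f_layer:.3f}")        # ~= 0.966
print(f"error per layer   ~= {1 - f_layer:.1%}")     # ~= 3.4%
```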

China's largest Tianyan quantum computer with the 504-qubit Xiaohong chip, lacking error correction, is useless, hiding detailed information.

China's largest 504-qubit Tianyan (or Xiaohong ) quantum computer is also impractical, hiding its detailed error rates.

This-4th-paragraph says
"Xiaohong is not about immediately breaking computing records or achieving quantum supremacy (= scaling up quantum computers cannot lead to quantum advantage nor speed-up, contrary to Google's optimistic estimate )"

"Xiaohong is Not a computer per se"  ← So this Chinese largest 504-qubit quantum computer is still Not a computer, but just publicity stunt like IBM's Condor, Osprey.

This-Technical advances-4th-paragraph says
"given the noise and lack of error correction (= cannot correct errors of 504 qubits )"

This-last-paragraph says
"It's important to note that the specifications of these devices have not been independently verified,"

This-application to Willow and Xiaohong says
"Xiaohong:... with 504 qubits but lacks publicly validated logical qubit data."

↑ The error rate of this largest 504-qubit quantum computer (= still Not a computer ) is so bad that they need to hide the true information.

It is impossible to scale up quantum computers beyond 150 qubits without increasing errors.

As a result, today's quantum computers can Not be scaled up beyond 150 qubits due to increasing errors; hence, the dreamlike millions of qubits for a practical quantum computer are impossible. An illustrative scaling model is sketched below.
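A rough way to see the scaling problem (an illustrative model, with a deliberately optimistic per-gate error rate): an n-qubit circuit of depth n contains on the order of n² gates, so with per-gate error ε the probability that No error occurs falls roughly like (1 − ε)^(n²).

```python
# Illustrative scaling model: an n-qubit circuit of depth n has ~n**2 gates;
# with per-gate error eps, success probability ~ (1 - eps)**(n**2).
eps = 0.001                       # optimistic 0.1% error per gate
for n in (15, 150, 1121):         # 1121 = IBM Condor's qubit count
    p_success = (1 - eps) ** (n * n)
    print(f"n = {n:4d} qubits: P(no error) ~= {p_success:.2e}")
# n =   15 qubits: P(no error) ~= 7.98e-01
# n =  150 qubits: P(no error) ~= 1.64e-10
# n = 1121 qubits: P(no error) ~= 0.00e+00  (underflows; effectively zero)
```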

 


Feel free to link to this site.