IBM's largest 1121-qubit Condor quantum computer is fake, a publicity stunt.


IBM's largest quantum computer was a lie.

IBM has paradoxically reduced its qubit count from 1121 to just 120 now, meaning its largest quantum computer was just a publicity stunt.

(Fig.1)  IBM's largest machines, the 433-qubit Osprey and the 1121-qubit Condor, are illusions, just publicity stunts with No utility.

Million-qubit quantum computers are impossible.

A practical quantum computer is said to need millions of qubits, which is impossible to ever realize.

A practical quantum computer is said to need millions of qubits (= one qubit can take only a 0 or 1 value ).

But today's quantum computers have far fewer qubits, as shown by Google's Willow with just 105 superconducting qubits and Quantinuum with just 98 unstable ion qubits (= unscalable ).

Even IBM's latest quantum computer still has only 120 qubits, which falls far short of the millions of qubits required for a practical computer ( this-4th-paragraph,  this-11th-paragraph ).

Less than a million qubits is useless.

Today's quantum computers, such as IBM's, Google's, and Quantinuum's, are impractical, with far fewer than the millions of qubits required for useful calculations.
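As a rough illustration of that gap, the sketch below compares the qubit counts cited above with the one-million-qubit figure quoted later on this page (the counts are taken from this article, not independently verified):

```python
# Gap between today's qubit counts (cited above) and the
# one-million-qubit estimate cited for a practical machine.
required = 1_000_000
machines = {"IBM Nighthawk": 120, "Google Willow": 105, "Quantinuum": 98}
for name, qubits in machines.items():
    shortfall = required // qubits
    print(f"{name}: {qubits} qubits, about {shortfall:,}x short of one million")
```

Every machine in the list is thousands of times short of the cited requirement.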

This-last-paragraph (2025) says
" Google and Microsoft say a truly useful quantum computer will need 1 million qubits ( this-p.21-4. Conclusion )"

This-6th-paragraph (2025) says  -- Million qubits needed
"Commercially useful quantum computers are expected to require millions of physical qubits. We're Not there yet."

This or this-9th-paragraph says  -- Useless 98-qubits
"They also ran experiments with 50 and 96 logical qubits, but the results weren't as impressive... in the future (= still unrealized ), when scientists scale them up to millions of qubits — which is necessary to outpace the fastest supercomputer"

IBM's 1121-qubit quantum computer is fake.

IBM's 433-qubit Osprey and 1121-qubit Condor were just a publicity stunt to make their dead-end quantum computers appear to be progressing.

The serious problem is that almost all media and academia hide the inconvenient truth that IBM's largest 433-qubit Osprey and 1121-qubit Condor are fake, useless quantum computers, just a publicity stunt.

IBM reduced qubits from 1121 to 120.  ← scam.

IBM's qubit count drastically decreased from 1121 in 2023 to 120 now, which shows its largest quantum computer Condor was fake, just a publicity stunt.

IBM drastically reduced its qubit count from Condor's 1121 in 2023 to just Nighthawk's 120 or Loon's 112 now in 2025, which shows its largest quantum computer Condor was just a publicity stunt.

This or this-8th-paragraph says  -- IBM scam
"In December 2023, for example, IBM scientists built a massive 1,000-qubit chip, named Condor, but its much smaller 127-qubit cousin, Eagle, was deemed the more exciting prospect from a research standpoint, given its error rate was five times lower. The same can be said for Nighthawk (= just 120 qubits ) compared with Loon (= just 112 qubits now in 2025 )."

↑ So IBM's largest 1121-qubit quantum computer was impractical, just a publicity stunt, much more error-prone than its smaller machines.

Still far from the practical million qubits.

This-middle-Quantum computing and Bitcoin (2025) says
"Quantum computing remains at an early stage."

"With the necessary error correction, that translates to tens of millions of physical qubits."

"By comparison, IBM's new Quantum Nighthawk processor contains 120 qubits... Its capabilities are still an order of magnitude below what is needed to crack the first cryptocurrency's cryptography."

IBM quantum computers are useless.

IBM's 433-qubit Osprey quantum computer is fake and useless, just a publicity stunt.

In November 2022, IBM unveiled the Osprey 433-qubit quantum computer.

But IBM has stubbornly hidden the detailed properties and error rate of this Osprey (= even now, the data for only the 433-qubit Osprey and the 1121-qubit Condor, the elusive quantum computers, are unknown, as shown in this-p.3-Table I, where the QV or quantum-volume data is unknown ).

This-13th paragraph says  -- Osprey was fake
"IBM has Not yet published any qubit fidelities or quantum volume data on its new Osprey processor. Either, they have Not yet done their benchmark, or the data is bad."

This-Updates-5.6. say  -- Osprey scam
"5. on May 8, 2023 — Osprey is now publicly available."

"6. September 7,2023 — BREAKING NEWS — Osprey is now retired.... No reason given"

↑ So the IBM Osprey quantum computer, which got a lot of media coverage, turned out to be useless, just a publicity stunt, secretly retired after just 4 months like a ghost.

IBM's 1121-qubit, 433-qubit machines are fake.

Even IBM itself did Not use the ghost-like, useless 433-qubit Osprey or 1121-qubit Condor quantum computers.

Even IBM researchers' latest research ( 5/22/2025 ) used only 52 qubits ( out of their older 127-qubit Eagle called ibm-nazca ), Not the larger 433-qubit Osprey or 1121-qubit Condor, with the help of a classical computer ( this-p.3-Figure 2, p.5-left,  this-lower ).

Other recent IBM research also used the older 127-qubit Eagle ( this-4th-paragraph, 2/12/2025, or  this-p.2-Fig.1 ).

Even the recent dubious HSBC research (= 9/2025 ) used only 109 qubits ( this-step 1 ).

I could Not find a single IBM research paper on the 433-qubit Osprey or 1121-qubit Condor ghost quantum computers, which were fake, just a publicity stunt to make today's dead-end quantum computers appear to be progressing.

IBM's 1121-qubit Condor was just a publicity stunt.

IBM's largest 1121-qubit Condor quantum computer was useless, too error-prone, so Nobody used it.

The 2~4th, 7th, 9th paragraphs of this or this say  -- Fake 1121 qubits

"The 1,121-qubit IBM Quantum Condor chip is built on the architecture of its previous flagship, the 127-qubit Eagle chip. In size, it is just shy of the record holder, a 1,125 (= 1225 )-qubit machine unveiled by the company Atom Computing"

"IBM said it won't put any of these Condor processors into its "next-generation" System Two quantum computer"

"Instead, it plans to use the newly launched IBM Quantum Heron processors, which have just 133 qubits each."  ← IBM slighted 1121-qubit Condor.

"For quantum computers to dethrone today's best classical computers, however, scientists would still need a quantum processor with millions of qubits (= today's largest 1000-qubits are still far from millions of qubits required for a useful quantum computer )."

"That's why IBM is far more excited by its Heron chip — its error rate is five times lower than that of the behemoth Condor."

↑ So IBM's largest 1121-qubit Condor quantum computer was too error-prone and far less useful than its smaller machines.

Scaling up quantum computers is impossible.

Scaling up to 1000 qubits increases errors, so quantum computers are impractical forever.

↑ So IBM's largest 1121-qubit Condor's error rate is 5 times worse than that of its much smaller 133-qubit (or 156-qubit ) Heron quantum computer ( this-lower-Key Specifications-table-error rate ).

This-last-paragraph says  -- Useless, error-prone Condor
"In fact, the original Heron demonstrated a 5× lower error rate than the much bigger Condor prototype (= IBM's largest 1121-qubit Condor is impractical, too error-prone. Scaling up quantum computers is impossible )"
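The scaling claim above can be illustrated with a toy model: if every gate fails independently with a fixed probability, the chance a whole circuit runs error-free shrinks exponentially with circuit size. The 0.1% per-gate error rate and the gate counts below are assumed for illustration only, Not IBM's published figures:

```python
def circuit_success_probability(per_gate_error: float, num_gates: int) -> float:
    """Probability that no gate errs, assuming independent gate errors."""
    return (1.0 - per_gate_error) ** num_gates

# Hypothetical: one gate per qubit per layer, 20 layers, 0.1% error per gate.
for qubits in (133, 1121):
    p = circuit_success_probability(0.001, qubits * 20)
    print(f"{qubits} qubits: success probability {p:.2e}")
```

Under these assumed numbers, the 133-qubit circuit still succeeds a few percent of the time, while the 1121-qubit circuit essentially never does, which is the sense in which scaling up without lower error rates buys nothing.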

IBM 156-qubit Heron is useless, error-prone.

Even IBM's 'true' largest quantum computer 156-qubit Heron is hopeless, too error-prone.

IBM clearly ignores its useless, error-prone largest 1121-qubit Condor quantum computer and focuses on the much smaller 156-qubit Heron.

But even in the latest research on this IBM hope, the Heron quantum computer, in March 2025, they could use only 15 error-prone qubits ( out of the 156-qubit Heron r2 quantum computer called ibm-fez, this-p.2-right-2nd-paragraph-last ), whose error rate was terribly high = 50% (= the fidelity of 15 qubits flipped just 20 times or layers was 50%, i.e. a 50% error rate, this-p.7-Fig.6 ), which is completely useless, unable to give right answers at all.
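A back-of-the-envelope check of that 50%-after-20-layers figure: assuming the simple multiplicative model where total fidelity after n layers is F = f**n (an assumption for illustration, not from the cited paper), the implied per-layer fidelity is about 0.966, i.e. roughly a 3.4% error every layer:

```python
# If total fidelity F after n layers follows F = f**n,
# the per-layer fidelity f is F**(1/n).
total_fidelity = 0.5   # cited figure: 50% fidelity after 20 layers
layers = 20
per_layer_fidelity = total_fidelity ** (1 / layers)
print(round(per_layer_fidelity, 4))       # per-layer fidelity, ~0.9659
print(round(1 - per_layer_fidelity, 4))   # per-layer error, ~0.0341
```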

IBM uses classical computers to correct errors.

IBM relies on classical computers' decoders to seemingly correct quantum computer errors, which is impractical.

IBM's latest quantum computers, such as the 112-qubit Loon, heavily rely on a classical computer's decoder to seemingly correct errors.

This-7th-paragraph says  -- Classical error correction
"IBM achieved a milestone in quantum error correction decoding—using classical computing hardware to accurately decode quantum errors ( this-4th-last-paragraph )."
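For context on what "classical decoding" means here, the simplest possible example is a majority vote on a bit-flip repetition code. This sketch is purely illustrative; it is NOT IBM's BP+OSD or relay-BP decoder, which are far more sophisticated:

```python
def majority_decode(measured_copies: list[int]) -> int:
    """Simplest classical decoder: a logical bit is stored as several
    physical copies, and ordinary classical code takes a majority vote
    to undo isolated bit flips."""
    return 1 if sum(measured_copies) > len(measured_copies) / 2 else 0

print(majority_decode([1, 1, 0]))  # one flip corrected -> 1
print(majority_decode([0, 1, 0]))  # one flip corrected -> 0
```

The point is that the error-correcting work is done by an ordinary classical program, not by the quantum hardware itself.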

Classical computers cannot fix error-prone quantum computers.

This-lower-Toward real-time quantum processing (2025) says  -- Still useless
" This is an essential milestone but still short of full quantum processing,... To get there, the decoding (= classical computer correcting quantum computer's errors ) must become even faster and smaller."

The 5-6th, 13th paragraphs of this (10/2025) say  -- cannot correct errors
"When IBM introduced its new error-correcting algorithm earlier this year, it proposed running it on a decoder called BP+OSD. The decoder is the classical algorithm"

"It's not very accurate, so your error-correcting code isn't as efficient as it could be,.. it wasn't fast enough"

"A useful fault-tolerant quantum computer... still depends more on advances in hardware physics (= quantum computers ) than on classical simulation results"  ← IBM's classical decoders cannot correct quantum computer's errors.

↑ This latest IBM classical decoder (= called relay-BP ) relies on artificial error models and freely-chosen parameters called "memory strength" ( this-p.5-right-last-paragraph ); such uncertain error correction is impractical and cannot be scaled up.

Neutral atomic qubits are useless forever.

Neutral atomic qubits unstably trapped in laser light are too easily lost and too error-prone, useless forever.

Atom Computing and QuEra try to use the energy states of each unstable neutral atom trapped in laser light as a qubit's 0 or 1, but such atomic qubits are useless and easily lost ( this-p.1-left-lower ).

So scaling up these extremely unstable neutral atomic qubits without increasing errors is impossible forever ( this-last-wrapping up ), as with other qubits.

This-8th-paragraph says  -- Error-prone atoms
"Indeed, Atom Computing's new computer is bigger, but not necessarily more powerful. The whole thing can't yet be used to run a single calculation, for example, due to the accumulation of errors as the qubit count rises."

This-3rd paragraph (2025) says  -- Lost atomic qubits
"Atoms have finite lifetimes, and processes involved in preparing the register can fail, making it increasingly difficult to scale up. The risk of losing atoms grows exponentially with the number of atoms"

The recent 6100 atomic qubits are fake, with No connections between qubits (= 6100 separate, disconnected atoms are Not a computer ).
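The exponential-loss claim quoted above is easy to model: with an assumed (hypothetical) per-atom retention probability p, the chance of keeping an entire N-atom register intact is p**N, which collapses as N grows. The 99% retention figure below is an assumption for illustration, not a measured value:

```python
def register_survival(per_atom_retention: float, num_atoms: int) -> float:
    """Probability that every atom in the register survives,
    assuming independent per-atom loss."""
    return per_atom_retention ** num_atoms

# Hypothetical 99% per-atom retention:
for n in (100, 1000, 6100):
    print(f"{n} atoms: {register_survival(0.99, n):.2e}")
```

Even at 99% retention per atom, keeping all 6100 atoms at once is astronomically unlikely under this model.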

China's largest Tianyan quantum computer, with its 504-qubit Xiaohong chip lacking error correction, is useless and hides detailed information.

China's largest 504-qubit Tianyan (or Xiaohong ) quantum computer is also impractical, hiding its detailed error rates.

This-4th-paragraph says  -- Still useless
"Xiaohong is not about immediately breaking computing records or achieving quantum supremacy (= scaling up quantum computers cannot lead to quantum advantage nor speed-up, contrary to Google's optimistic estimate )"

"Xiaohong is Not a computer per se"  ← So this Chinese largest 504-qubit quantum computer is still Not a computer, but just publicity stunt like IBM's Condor, Osprey.

This-Technical advances-4th-paragraph says  -- Error-prone
"given the noise and lack of error correction (= cannot correct errors of 504 qubits )"

This-last-paragraph says  -- Hide secret
"It's important to note that the specifications of these devices have not been independently verified,"

This-application to Willow and Xiaohong says
"Xiaohong:... with 504 qubits but lacks publicly validated logical qubit data."

↑ The error rate of this largest 504-qubit quantum computer (= still Not a computer ) is so bad that they have to hide the true information.

It is impossible to scale up quantum computers to more than 150 qubits without increasing errors.

As a result, today's quantum computers can Not be scaled up beyond 150 qubits due to increasing errors; hence, the dreamlike millions of qubits for a practical quantum computer are impossible.

 


Feel free to link to this site.