Waterloo Physicists Found a Workaround for Quantum Computing's Copy Protection

Encrypting quantum data as you duplicate it technically doesn't violate the no-cloning theorem. Which feels like cheating, but okay.

Oliver Senti
Senior AI Editor
January 8, 2026 · 4 min read
[Image: Abstract visualization of quantum information being encrypted and duplicated]

Researchers at the University of Waterloo have published a method to back up quantum information, something physicists have insisted was impossible for over four decades. The paper, which appeared in Physical Review Letters on January 6th, is titled "Encrypted Qubits can be Cloned." I appreciate the directness.

The trick is almost annoyingly simple: encrypt the quantum state while copying it, and you can make as many duplicates as you want. The catch is that you get exactly one decryption key, and using it consumes it. So you end up with a bunch of encrypted copies, only one of which you can ever actually read.

That sounds like a limitation until you realize it's more or less how backups work in practice: keep redundant copies, restore from whichever one you need.

The no-cloning problem

The no-cloning theorem has been a fundamental constraint in quantum physics since Wootters, Zurek, and Dieks formalized it in 1982. (An earlier version appeared in a 1970 paper by James Park, but apparently nobody noticed for twelve years.) The basic idea: no physical process can produce a perfect copy of an unknown quantum state, because anything that extracts enough information to copy it disturbs the original. It's baked into how quantum mechanics works.
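The standard proof is a one-liner about linearity: quantum evolution is linear, so a machine that copies the basis states |0⟩ and |1⟩ is forced, by linearity alone, to mangle superpositions. Here's a quick numerical sanity check of that argument (a toy illustration of the textbook theorem, not anything from the Waterloo paper):

```python
import numpy as np

# Single-qubit basis states and the superposition |+>
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

# A linear "cloner" defined to copy the basis states perfectly:
#   |0>|0> -> |0>|0>   and   |1>|0> -> |1>|1>
# (columns of C are the images of the corresponding input basis vectors)
C = np.zeros((4, 4))
C[:, 0] = np.kron(ket0, ket0)   # image of input |00>
C[:, 2] = np.kron(ket1, ket1)   # image of input |10>

# Feed it the superposition |+>|0> and compare with a true clone |+>|+>
out = C @ np.kron(plus, ket0)   # linearity forces (|00> + |11>)/sqrt(2)
target = np.kron(plus, plus)    # what a perfect clone would look like

fidelity = abs(out @ target) ** 2
print(f"fidelity with a true clone: {fidelity:.3f}")  # 0.500, not 1.0
```

The output isn't a clone of |+⟩ at all; it's an entangled Bell state, which is exactly what the theorem predicts.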

This is actually useful for cryptography, since eavesdroppers can't secretly duplicate quantum keys. But it's terrible for everything else. No backups. No redundancy. If your quantum computer loses a qubit, that information is gone.

Researchers have tried workarounds before. Imperfect cloning, probabilistic cloning, broadcasting mixed states. None of them really solved the problem.

The encryption loophole

Achim Kempf and Koji Yamaguchi approached it differently. According to Yamaguchi, who is now at Kyushu University, the method works because encrypting quantum information during copying sidesteps the theorem entirely. You're not creating identical copies of an unknown state. You're creating encrypted copies, and the encryption process uses "noise qubits" that don't carry any information about the original.

Then when you decrypt one copy, the key expires. You've retrieved your data, and all those other encrypted copies become permanently inaccessible. The theorem holds. Everyone's happy.
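The paper's actual construction is more involved, but the core ingredient is reminiscent of the quantum one-time pad: conjugating a qubit by a random Pauli operator leaves any observer without the key holding a maximally mixed state that reveals nothing, while the two classical key bits suffice to undo the encryption exactly. A toy simulation of that ingredient (the names and structure here are mine, not the paper's):

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli(a, b):
    # Quantum one-time pad operator X^a Z^b for key bits (a, b)
    return np.linalg.matrix_power(X, int(a)) @ np.linalg.matrix_power(Z, int(b))

rng = np.random.default_rng(0)
psi = np.array([0.6, 0.8], dtype=complex)   # an arbitrary qubit state

# Encrypt: apply X^a Z^b for a uniformly random 2-bit key
a, b = rng.integers(0, 2, size=2)
enc = pauli(a, b) @ psi

# Averaged over all four keys, the ciphertext density matrix is I/2:
# without the key, the encrypted qubit carries no information about |psi>.
rho = np.outer(psi, psi.conj())
avg = sum(pauli(x, z) @ rho @ pauli(x, z).conj().T
          for x in (0, 1) for z in (0, 1)) / 4
assert np.allclose(avg, I / 2)

# Decrypt with the (single-use) key: the same Pauli, inverted, restores psi
dec = pauli(a, b).conj().T @ enc
assert np.allclose(dec, psi)
print("ciphertext is maximally mixed; the key recovers the state exactly")
```

The assertion on `avg` is the "no information" half of the story: averaged over keys, the ciphertext looks identical no matter what was encrypted. The single-use key rule in the actual protocol is what keeps the decrypted copy unique.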

The press release from Waterloo uses the phrase "quantum Dropbox" several times, which, sure. The point is that you could now store quantum data redundantly across multiple servers. One server fails? Your backup on another server still works. You just can't access both simultaneously.

What's actually new here

I'll be honest, I expected to find a catch buried somewhere in the paper's assumptions. The obvious question: how many resources does this eat?

The paper doesn't hide it. The number of gate operations scales polynomially with the number of copies you want to create. That's not terrible. The noise qubits required also scale with the number of copies, but they're described as "noninteracting ancillas" that can be kept physically separate from your actual computation. So you're not entangling your entire system with garbage.

What I couldn't find is how this interacts with existing quantum error correction schemes. The paper mentions error correction in its literature review but doesn't address whether encrypted cloning could work alongside it or would conflict. That seems like a significant open question for anyone trying to build actual fault-tolerant systems.

The Waterloo angle

The university is clearly positioning this as part of their broader quantum portfolio. The Institute for Quantum Computing has spun out more than 23 startups, according to the announcement. Kempf holds the "Dieter Schwarz Chair in the Physics of Information and AI," which is a chair name I was not previously aware of.

The research was funded by a mix of sources including JSPS, NSERC, and Australia's ARC, plus support from the Perimeter Institute. International money for a paper that could eventually enable distributed quantum computing infrastructure. Make of that what you will.

What happens next

The paper explicitly calls out "quantum multi-cloud storage" as an application. The idea that quantum data could be backed up across multiple cloud providers, with built-in redundancy and encryption, is genuinely new. Previous proposals for quantum cloud services had to accept that data loss was just... unavoidable.

Whether anyone actually builds this is a separate question. We're still at the point where a 100-qubit system is impressive, even though describing such a system's state already takes 2^100 amplitudes, more numbers than any classical machine could store explicitly. Scaling that while also implementing encrypted cloning is not a near-term project.

But the theoretical foundation is apparently solid. Forty-plus years of "you can't copy quantum states" has a formal exception now.

Tags: quantum computing, University of Waterloo, physics, no-cloning theorem, quantum information
Oliver Senti

Senior AI Editor

Former software engineer turned tech writer, Oliver has spent the last five years tracking the AI landscape. He brings a practitioner's eye to the hype cycles and genuine innovations defining the field, helping readers separate signal from noise.
