Quantum Computers Are Not A Threat To 128-bit Symmetric Keys

A lot has been made of a post-quantum future in which traditional encryption methods are suddenly rendered obsolete. With this terrifying idea in mind, it’s reassuring to see some recent pushback against it, backed by factual evidence. In a recent blog post, [Filippo Valsorda] – a cryptography engineer – argues that 128-bit symmetric ciphers like AES-128 and hashing algorithms like SHA-256 are not at risk of being obliterated in a post-quantum future.

Rather than asking us to just take his word for it, [Filippo] walks us through the flawed understanding of Grover’s algorithm that underlies much of the panic. While it’s very true that this quantum search algorithm can decrease the time required to find a solution, the speed-up on a single thread is quadratic, not exponential. Asymmetric cryptography systems like ECDH, RSA, and their kin are very much at risk courtesy of Shor’s algorithm, but the same is not true for symmetric systems.

Another interesting detail of Grover’s algorithm is that you cannot simply run the search in parallel to get a corresponding speed-up, as it does not parallelize well. Barring a breakthrough that replaces Grover’s with something that lends itself better to a parallel search, it would seem that we won’t have to abandon classical symmetric encryption any time soon.
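To put the quadratic speed-up in perspective, here is a back-of-the-envelope sketch of the arithmetic in Python. The million-machine figure is a purely illustrative assumption, and "queries" stands in for whatever a single Grover iteration would actually cost on real hardware:

```python
import math

KEY_BITS = 128
N = 2 ** KEY_BITS              # size of the AES-128 key space

# Classical brute force needs ~N/2 trials on average;
# Grover's algorithm needs ~sqrt(N) queries on one machine.
classical_trials = N // 2
grover_queries = math.isqrt(N)             # exactly 2**64
assert grover_queries == 2 ** 64

# Grover parallelizes poorly: with p machines each searching N/p keys,
# each machine still needs ~sqrt(N/p) queries, so the wall-clock
# speed-up is only sqrt(p), not p.
p = 10 ** 6                                # a million machines (hypothetical)
queries_per_machine = math.isqrt(N // p)

print(f"Grover, 1 machine:   ~2^{grover_queries.bit_length() - 1} queries")
print(f"Grover, {p} machines: ~2^{queries_per_machine.bit_length() - 1} queries each")
# Even with a million quantum computers, each one still faces ~2^54
# sequential queries -- far beyond anything remotely practical.
```

The point of the sketch is that throwing hardware at Grover's algorithm buys only the square root of the hardware count, which is why 128-bit symmetric keys retain a comfortable margin.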

Incidentally, even for Shor’s algorithm, there are still some hold-ups. Current quantum computers are not even able to factor 21 yet. Meanwhile, supposed quantum computing breakthroughs are being trolled with a Commodore 64.

20 thoughts on “Quantum Computers Are Not A Threat To 128-bit Symmetric Keys”

  1. They’re not really a threat to anything else either. Every paper where an encryption key has been “broken” engaged in what should honestly be called fraud: keys that are within 5 of each other, keys that are mostly zeros, keys that are structurally defective in other ways. All rendering the encryption completely moot.

    See: https://eprint.iacr.org/2025/1237 (available in PDF, and both very readable and highly entertaining)

    They make a solid argument that teams working on these factorization problems shouldn’t be allowed to know the factors ahead of time… though current and foreseeable quantum computers are so pathetic that any achievable number could be factored near-instantly on Wolfram Alpha.

    Combined with recent realizations that quantum computers are far less useful for chemical simulation than expected, and the recent buzz in the news around “quantum” being the next big thing seems more like an attempt to set up another hype cycle in the vein of crypto, metaverse, and the now-cooling genAI craze.

  2. I believe quantum systems will never be realized at the scale necessary to be a threat. I think this notion is an example of quantum evangelism, a thing they say to drum up investor money. It is a fascinating field, but it is completely impractical, and for anyone outside that research field it is useless.

    1. It may be possible to build such an argument on the observation that quantum and classical computers are being built using the same technology, and that technology is petering out. But it also seems that the “error correction” required to make a quantum computer work far outstrips any conceivable utility and makes them impossible to build in the first place.

  3. Can someone point me to some resources explaining the “magic” behind quantum computers?

    Whatever you can do with a regular computer, you can also do with pen & paper & lots of patience. What is the “magic” that quantum computing does, that it cannot be reproduced with a non-quantum approach?

    1. 3blue1brown vids, for a start:

      https://www.youtube.com/watch?v=RQWpF2Gb-gU
      https://www.youtube.com/watch?v=Dlsa9EBKDGI

      “What can you do”? Really, nothing new computing-wise, but certain problems can, in theory, be solved much faster. Like many tantalizing concepts, it always seems to be a year away. When will we get practical implementations at scale, or even a one-off? I would expect on the order of 25 years. Too many unknowns to be more precise, as far as I can see (not my field, but I follow the less specialized lit in CommACM, IEEE Computer Society, etc., and have a few former students who are in the field).

      OTOH, quantum entanglement DOES provide something unique in communication.

      1. Except you can’t use entanglement to convey data in any form; it’s random. You can force a wave function to collapse, but you can’t control what the particle collapses into, so all the other side knows is that they got the opposite polarity.

        No data was transferred.

        On top of that, once the wave function collapses, the particles are no longer entangled.

      2. The problem with using entanglement for communication is you are no longer transporting information. You are transporting a very fragile physical artifact. (a specific particle)

        This would require the wholesale rip out and replacement of all human networking infrastructure, from the backbone of the internet, to the buried telco cables in your neighborhood, to the ethernet in your walls, for nebulous benefit.

        It would also be replacing said system with one that is far less flexible, since any upgrade requires total physical replacement of all the equipment. You can still use dialup with the modern internet; it’s just slower. The data can still be copied across from one system to the other. Not so with physical quantum systems.

  4. Nobody ever said quantum computers were a threat to symmetric crypto.

    Also, SHA is not an encryption algo. It’s a hash function. It is often used in crypto systems, but not to encrypt stuff. But yes, SHA is also not affected by QC.

    I asked quantum computing researcher Scott Aaronson about the “not even 15 got factorized” point here:
    https://scottaaronson.blog/?p=9665#comment-2029378

    “the actual goal is to factor using Shor’s algorithm on a fault-tolerant QC. That has not yet been done. That’s the thing where, once it’s done for 15, doing it for 2048-bit RSA composites should “merely” be a matter of scaling up.”

    So Quantum Error correction is the obstacle. If someone gets that working and it scales, we might see (small) keys getting cracked sooner than it looks. If we don’t, we’ll never see 15 get factorized.

    1. It’s more of a myth that is persistent in board rooms and among the populace, it would seem. Until I started looking into it myself years ago I too was under that false assumption based on what I had been told.

      And yes, QEC is absolutely the issue here, along with the complexity of scaling up qubit circuits. Right now the need to run each calculation multiple times to use quantum processors with some degree of accuracy is why even a Commodore 64’s 6502 CPU can compete.

      Just need that major QEC breakthrough, which I’m sure will be any day now.

  5. the point is raised that 128-bit symmetric keys like AES-128 and SHA-256 are at risk of being obliterated in a post-quantum future.

    I guess a “not” is accidentally left out, as Filippo’s blog title is literally the opposite, at least for 128 bit symmetric. Of course SHA-256 should not have been mentioned in the same sentence since (as noted above) it’s not a symmetric crypto algorithm. If you’re interested, it’s best to read Filippo’s blog directly, because a lot of nuances are lost in the HAD article.

    1. Thanks for catching that typo. Obviously there should have been a ‘not’ in there. I have also clarified that SHA is a hashing algorithm.

      As for nuance, I wouldn’t claim to be a quantum algorithm researcher, so said nuance is probably best found in the original article regardless :)

  6. How is it not a parallel problem? It’s exactly that! You can easily split up your solution space. E.g. with a key valued 1–16, you can easily try keys 1–8 and 9–16 in parallel at double the speed. I’d argue that scales immensely well. Heck, you could even grab a whole bunch of keys we want to crack together, and while we won’t get the answer to a single problem faster, we get them all eventually in a single pass.

    Or do I completely misunderstand?

    1. (Hypothetical) quantum systems do not parallelize the problem on separate hardware. To test those two ranges of possible keys, you need two computers instead of one. A quantum computer can test all the different keys simultaneously on one machine. The physical machine is the same whether you’re testing one key, two keys in parallel, 100 keys in parallel, or an infinite number of keys in parallel.
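The classical key-space splitting described in the comment above can be illustrated with a toy search over a deliberately tiny, hypothetical 16-bit key space (the secret key and worker count are made-up example values); the closing comment contrasts its linear parallel speed-up with Grover's square-root scaling:

```python
import math

def classical_search(secret, lo, hi):
    """Scan one slice of the key space; return the key if it is in [lo, hi)."""
    for k in range(lo, hi):
        if k == secret:
            return k
    return None

N = 2 ** 16                 # toy 16-bit key space (65,536 keys)
secret = 0xBEEF             # hypothetical key we are "cracking"
p = 4                       # split the space among 4 classical workers

# Each worker gets a disjoint slice of the key space.
slices = [(i * N // p, (i + 1) * N // p) for i in range(p)]
results = [classical_search(secret, lo, hi) for lo, hi in slices]
found = next(k for k in results if k is not None)
assert found == secret

# Classical: p workers -> worst case N/p trials each (linear speed-up).
# Grover:    1 machine -> ~sqrt(N) queries; p machines -> ~sqrt(N/p) each,
#            i.e. only a sqrt(p) wall-clock speed-up.
print(N // p, math.isqrt(N), math.isqrt(N // p))  # 16384 256 128
```

With four classical workers the worst case drops by a factor of four (16,384 trials each), while four Grover machines would only cut the query count in half (from 256 to 128 in this toy space), which is the asymmetry the reply above is getting at.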

  7. The good news is that security always moves on to better algorithms once theoretical cracks start to show in existing privacy algorithms.

    The world’s largest data store is, as far as I know, in Maryland, and once QCs eventually become useful, that offline data hoard protected with legacy crypto will be data mined.
