But until then it seems to me like something based on the difficulty of attacking hash functions would be a good bet for quantum resistance. Totally unsure how to make a public-key scheme from that, but it has a few nice properties:
- hashes are often tuneable, you can add more state and increase the keyspace/security
- good hashes don't have any weaknesses that Q can exploit
- hashes are often pretty fast
- hashes are well studied
- hashes seem to be hard both classically and for quantum computers
Sphere10.com/articles/cryptography/pqc/wots
Signing many things with one identity is possible by precomputing a Merkle tree, but this takes time and the signatures get big.
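To make the hash-based idea concrete, here's a toy Lamport-style one-time signature: the secret key is random preimages, the public key is their hashes, and a signature reveals one preimage per message bit. This is a minimal illustrative sketch (names and parameters invented here); real schemes like WOTS+ and SPHINCS+ are far more refined, and signing two messages with one Lamport key leaks the secret key.

```python
# Toy Lamport one-time signature built only from a hash function.
# Illustrative sketch, not a production scheme.
import hashlib
import os

def H(data):
    return hashlib.sha256(data).digest()

N = 256  # number of message-digest bits

def keygen():
    # secret key: two random 32-byte preimages per digest bit
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(N)]
    # public key: the hashes of those preimages
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(N)]

def sign(sk, msg):
    # reveal the preimage corresponding to each digest bit
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg, sig):
    # hash each revealed preimage and compare against the public key
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))
```

Note how large everything is: the signature alone is 256 preimages of 32 bytes each, which is the size problem the comment above mentions once you start stacking these into Merkle trees.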
Yes, it's reasonably fast even at very long lengths, but the main problem is the very long lengths. Code based and not hash based though.
Edit: not very fast to generate a key though. It's mostly used for non-ephemeral stuff.
McEliece has a public key that is a general linear code. A code is a bunch of linear equations constraining codewords, codewords are vectors over a finite field, and decoding a code is solving those equations subject to errors from a given distribution. Sound familiar?
They’re not the same problem, and the distributions are different in rather fundamental ways (which may or may not make a difference), but they are quite related. I would not move my eggs to the McEliece basket right now.
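To make the "solving linear equations subject to errors" picture concrete, here's a toy sketch with the classic [7,4] Hamming code, a single-error-correcting code (nothing like the Goppa codes McEliece actually uses; this is just to show the mechanics):

```python
# Parity-check matrix H of the [7,4] Hamming code: column j is j written in
# binary, so codewords c satisfy H*c = 0 (mod 2), and for a received word
# r = c + e with a single-bit error e, the syndrome H*r = H*e spells out the
# error position directly. Illustrative sketch only.
H = [[(col >> row) & 1 for col in range(1, 8)] for row in range(3)]

def syndrome(r):
    return [sum(H[row][j] * r[j] for j in range(7)) % 2 for row in range(3)]

def correct(r):
    s = syndrome(r)
    pos = s[0] + 2 * s[1] + 4 * s[2]  # 0 means no error detected
    r = r[:]
    if pos:
        r[pos - 1] ^= 1  # flip the bit the syndrome points at
    return r
```

Decoding a general, unstructured linear code this way is NP-hard; the Hamming code is easy precisely because of its structure, and that tension between hidden structure (for the legitimate decoder) and apparent unstructure (for the attacker) is what McEliece plays with.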
Hash-based signatures sound as safe as ever.
One of my favorite sci-fi MacGuffins was in the book "A Fire Upon the Deep", where the protagonists were space truckers with a cargo of one-time-pad keys.
The implied context of "post-quantum cryptography" is usually asymmetric cryptography.
However, we don't know if these bounds are tight, or whether they are eg in P, or something in between.
It's kind of trivial to say it's in NP because we can verify in P time; that's not a criticism of you, just of the definition!!
I think a better definition of NP is "only nonpoly algos can exist, no P algos can exist". By that definition of NP, we don't even know that it's in NP strictly because there could exist P algorithms for solving it. It's more in 'unknown-NP' if that were a class! hahaha! :)
So maybe it should really be P and NDP.
I like to explain non-determinism in terms of getting a hint, or having an (untrusted) cheatsheet in a test. Or always making lucky guesses (but you don't trust your guesses).
But as long as your parallel executions don't interact at all, the definitions are identical, I think.
Yes? No one ever said it was.
None of the common cryptographic problems are expected to be NP-complete, even if they aren't in P. That's because they are known to be in both NP and in co-NP, and it's expected that NP != co-NP.
> I think a better definition of NP is "only nonpoly algos can exist, no P algos can exist".
In what sense is that a 'better' definition than the standard definition? It sounds like what you are talking about is NP\P (where \ is set subtraction, ie 'NP minus P').
I don't even know what co-NP is. Could you explain?
I think that's a better definition because I find it more predictive and useful to think about: pretty concrete to know that you can't have a polytime algo for it.
Yeah, I guess what you're saying about NP\P is right in that it's a restatement of the definition of what I said, haha! I'm not an expert this is just what I think :)
See https://en.wikipedia.org/wiki/Co-NP That article even mentions integer factorisation.
> I think that's a better definition because I find it more predictive and useful to think about: pretty concrete to know that you can't have a polytime algo for it.
Well, that's a non-standard definition for NP, and you would have a hard time talking to anyone with it. And at the moment we have no clue whether your 'NP' has any problems in it at all, or whether it's an empty set. In that sense, it's a very impractical definition.
Btw, there are some nice alternative but equivalent definitions for traditional NP. The classic definition is basically: NP is the class of problems where you can check a proposed answer in polynomial time if someone gives you a hint (ie they give you the answer and whatever else you need, but you have to verify it, you can't trust the hint).
A nice alternative definition says that with access to randomness, the verifier only needs O(log n) random bits and only has to look at a constant number of bits (three suffice) of a polynomially long hint, and is still guaranteed to catch any fake answer with constant probability, which you can then amplify by repetition. See https://en.wikipedia.org/wiki/PCP_theorem
It's actually a very fascinating definition and question: Are there problems for which we can prove they are in NP but also prove they cannot have polynomial time (P time) solutions?
I did check out that wiki page first, but found it super difficult to parse. Do you have some insight that could help me understand more simply/intuitively??
For instance, I found the definition of NP as P if you have a nondeterministic Turing machine to be super easy to understand. But when that wiki starts talking about "certificates" I just have no idea.
That is, co-NP is the set of decision problems where there exists a polynomial p(n) and a polynomial-time bounded Turing machine M such that for every instance x, x is a no-instance if and only if: for some possible certificate c of length bounded by p(n), the Turing machine M accepts the pair (x, c).
That's exactly the famous P!=NP question.
> I did check out that wiki page first, but found it super difficult to parse. Do you have some insight that could help me understand more simply/intuitively??
Scott Aaronson might have some good intro material on his blog. Otherwise, you can just ask your favourite search engine (or AI bot) for some intro material.
> For instance, I found the definition of NP as P if you have a nondeterministic Turing machine to be super easy to understand. But when that wiki starts talking about "certificates" I just have no idea.
The certificate is the 'cheatsheet' or 'hint'. Basically the question is, how well can you do in an exam where you have to show your work, if someone gives you all the answers? (But that guy is a troll, so you can't trust him, and still need to verify everything.)
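Concretely, a certificate is just data that makes checking a "yes" answer easy. A toy sketch for SAT (the function name and clause encoding are invented here for illustration): finding a satisfying assignment may be hard, but verifying one somebody hands you is trivial.

```python
# Verify a SAT certificate in polynomial time. A clause is a list of ints,
# where +v means variable v and -v means NOT v; the assignment is the
# untrusted "cheatsheet" mapping each variable to True/False.
def verify_sat(clauses, assignment):
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )
```

The verifier runs in time linear in the formula size, which is exactly what puts SAT in NP; nothing about it tells you how to *find* the assignment.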
Maybe being NP-complete isn’t as important as I realize? Or maybe there’s something about NP-complete problems that make them less amenable to be a valid crypto system?
> Or maybe there’s something about NP-complete problems that make them less amenable to be a valid crypto system?
To simplify a bit, the problem is that to work as a crypto system your particular problem needs to be both in NP and in co-NP. And we know of no problem that is both NP-complete and in co-NP; it's widely conjectured that there is no such problem. See https://en.wikipedia.org/wiki/Co-NP, that page even mentions integer factorisation.
That's why you can't just take the NP-complete problem itself as a basis for your cryptosystem, you have to pick some subset of instances that's also in co-NP. And apparently it's almost impossible for us to pick such a subset, but still have the instances be hard enough to solve on average.
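Here's a sketch of what "in NP and in co-NP" looks like for a factoring-style decision problem ("does N have a nontrivial factor below k?"): a yes-certificate is just a divisor, and a no-certificate is the full prime factorization. This is a toy, with a naive trial-division primality check standing in for the primality proofs (e.g. Pratt certificates) a real no-certificate would carry:

```python
# Toy certificate verifiers for "does N have a nontrivial factor < k?"
# Illustrative sketch; function names invented here.

def is_prime(p):
    # naive stand-in for a verifiable primality certificate
    return p > 1 and all(p % d for d in range(2, int(p ** 0.5) + 1))

def verify_yes(N, k, divisor):
    # yes-certificate: a single nontrivial divisor below k
    return 1 < divisor < k and N % divisor == 0

def verify_no(N, k, factorization):
    # no-certificate: N's complete prime factorization, all factors >= k
    # (then every nontrivial divisor of N is >= k as well)
    prod = 1
    for p in factorization:
        prod *= p
    return prod == N and all(is_prime(p) and p >= k for p in factorization)
```

Both directions have short, efficiently checkable certificates, which is what puts this problem in NP ∩ co-NP, and is exactly why nobody expects it to be NP-complete.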
> Kyber is an IND-CCA2-secure key encapsulation mechanism (KEM), whose security is based on the hardness of solving the learning-with-errors (LWE) problem over module lattices.
Curious to see if Chen's work will eventually lead to Signal selecting a different algorithm.
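To illustrate the LWE problem named in the quote above, here's a toy instance (parameters far too small to be secure, chosen only so the shape is visible): without the error term, recovering s from (A, b) is just Gaussian elimination; the small noise is what makes it hard.

```python
# Toy LWE instance: b = A*s + e (mod q), with a tiny +/-1 error vector e.
# Illustrative sketch only; real parameters are vastly larger.
import random

q, n, m = 97, 4, 8
random.seed(0)

s = [random.randrange(q) for _ in range(n)]                  # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]            # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q
     for i in range(m)]
# Each row of (A, b) is a linear equation in s that only holds approximately,
# which is the same "linear equations subject to errors" shape as decoding.
```

Module-lattice schemes like Kyber use a structured variant (Module-LWE) of this problem for efficiency, but the underlying shape is the same.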
> NIST-approved schemes based on lattice problems include Kyber and Dilithium (which I wrote about recently.)
> Chen’s algorithm does not immediately apply to the recently-standardized NIST algorithms such as Kyber or Dilithium.
And it's not just Signal. Apple's new iMessage protocol, PQ3, uses Kyber, too. [1] Most deployments of PQ-crypto that I know of have used Kyber.
https://github.com/mullvad/mullvadvpn-app/blob/main/talpid-t...
Really the only point I'm trying to make here is that there's nothing eyebrow-raising about systems using lattice crypto; after IFP/FFDLP stuff like RSA and ECDLP, lattices are maybe the next most mainstream approach to constructing asymmetric systems.
The good part is that McEliece is based on a problem (decoding a general linear code) that is proven NP-hard, so cracking it in polynomial time would need P = NP.
There’s nowhere near the same urgency about this, and significantly more denial over global warming. A bit apples and oranges, but climate models have a better track record, are wildly conservative (ie our present is much worse than what past climate models predicted), and it’s a real problem that we know exists and is a bullet train headed our way.
Global warming is talked about on the radio, on TV, and in movies. It's sung about in songs. There are several conferences, widely attended protests, stickers on appliances, tax initiatives, etc.
A few comments on obscure sites like HN can hardly be called a panic. It is silly to suggest that there is more urgency about post-quantum attacks on crypto than global warming.
Depending on how you measure it, we theoretically spend about a trillion bucks a year worldwide on climate change ( https://www.climatepolicyinitiative.org/publication/global-l... ) and maybe a couple billion bucks a year on quantum computers.
Eg the US administration likes to pretend that climate change is a top priority, but then complains when Chinese taxpayers generously subsidise solar cells and electric cars for American consumers.
The environmental controls you are talking about mostly protect the local environment. Protecting the local environment is great, but crudely speaking, the global climate doesn't much care about whether you dump some toxic waste in some part of China.
What I hear in the Americans' complaints about these exports is that they are worried about good old jobs for good old American boys and girls (preferably in a union that forms a strong voting bloc one way or another). So I would take the Americans at their word here.
Yet again, if the local environment in China were a priority for the administration that outweighed concern about climate change, you would see a very different policy, instead of just tariffs and other restrictions on imports of some Chinese goods. You can probably come up with your own examples.
(my favorite maxim from 2012 era atheism debates on youtube)
But yeah I totally agree. Putting the cart before the horse here with the outlined consequences probably isn't smart, but to be fair, I haven't (and probably couldn't) read the paper.
It's a bit older than that: https://en.wikipedia.org/wiki/Extraordinary_claims_require_e...
Quantum Algorithms for Lattice Problems - https://news.ycombinator.com/item?id=39998396 - April 2024 (118 comments)
So much of quantum computing is theory, and so much of current crypto is applied.
Is it realistic to think the first applied quantum computers could quickly get us to a P=NP solution, rendering all crypto effectively irrelevant?