A fast algorithm for parity would be interesting. Currently there is no known way to compute the parity of the number of prime factors that is faster than just factoring N.
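For concreteness, the "parity" in question is the parity of the number of prime factors of N counted with multiplicity (the Liouville function is -1 raised to this count). A minimal sketch of the naive approach, which just factors N by trial division (the function name is mine, not from any library):

```python
def parity_of_prime_factors(n: int) -> int:
    """Return 0/1 parity of the number of prime factors of n,
    counted with multiplicity, by naive trial division."""
    count = 0
    d = 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:          # whatever remains is a prime factor
        count += 1
    return count % 2

# 12 = 2 * 2 * 3 has three prime factors, so odd parity:
print(parity_of_prime_factors(12))  # 1
```

A genuinely fast algorithm would have to produce this bit without recovering the factorization, which is exactly what isn't known.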
There is a well-known hand-wavy argument that computing any function of the prime factors of arbitrary integers should be difficult. Very roughly, the idea is that if you can compute such a function over the integers, you can probably also compute it over other rings. Applying it over certain number fields would let you extract information about the low bits of the prime factors, which is thought to be hard.
Thanks, the MathOverflow link was very informative. It's funny because I started out computing a number that I thought would be useful for factoring N, but it turned out to take one of two values depending on whether N was prime or had exactly two prime factors (this is easily proven). Subsequent testing in software suggested it was actually determining the parity of the number of prime factors. I believe that's what it does in general but never bothered to go any further.
I would count its development and publication as another example of the NSA being a bit ahead, as it very much looks like they developed an algorithm in secret with a backdoor. I would count the fact that the public found it (after a few years) as a point for public crypto research.
I agree this is a good idea, but I think using the words "encrypt" and "decrypt" confuses the goal a little, since the idea is still useful even if the messages aren't required to be confidential from anyone. I think "encapsulate" and "release" describe it better. These operations can be combined with encrypt and decrypt if confidentiality is also needed.
The only reason to use one-time pads is an extraordinary level of paranoia. If you are paranoid enough to need a one-time pad, then it makes little sense for you, as a private citizen, to trust that hardware you take into public places hasn't been compromised. Manual creation and transmission of the encrypted message is the only approach that makes sense.
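To be fair, the mechanics of a one-time pad are trivial; it's the key handling (true randomness, pad length, no reuse) that carries all the difficulty. A minimal sketch:

```python
import secrets

def otp_xor(message: bytes, pad: bytes) -> bytes:
    """One-time pad: XOR the message with the pad.
    The pad must be truly random, at least as long as the
    message, and NEVER reused -- reuse breaks the scheme."""
    assert len(pad) >= len(message)
    return bytes(m ^ k for m, k in zip(message, pad))

pad = secrets.token_bytes(32)
ct = otp_xor(b"attack at dawn", pad)
# Decryption is the same XOR with the same pad:
pt = otp_xor(ct, pad)
```

The point in the comment stands: generating and exchanging those pads securely, by hand, is the entire problem.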
If lots of people adopted one-time pads it might hinder the NSA et al., but this would be akin to trying to convince people to wear masks whenever they are in public to hinder CCTV monitoring.
I think it is mainly a logistical issue. The problem is that encryption keys are a single point of failure. IT practices at most businesses would have to be very rigorous before a design with a single point of failure for any significant part of the IT infrastructure could even be remotely considered.
Changing business practices to reduce the risk of using end-to-end encryption to an acceptable level has an associated cost which, at most companies, probably dwarfs the cost of the technical implementation.
Nitpick: RSA isn't known to be equivalent to factoring. More precisely, there is no proof that if textbook-RSA ciphertexts could be decrypted quickly, then integers could also be factored quickly. (The converse is clear: if you can factor integers, you can decrypt RSA messages quickly.)
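The direction that is known, factoring implies decryption, can be sketched with deliberately tiny toy parameters (illustration only; real RSA uses large primes and padding):

```python
# Toy textbook RSA with tiny primes.
p, q = 61, 53
n, e = p * q, 17
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # legitimate private exponent

m = 42
c = pow(m, e, n)             # encrypt

# An attacker who can factor n recovers an equivalent
# private exponent and decrypts:
def smallest_factor(n: int) -> int:
    f = 2
    while n % f:
        f += 1
    return f

p2 = smallest_factor(n)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(c, d2, n) == m    # factoring => decryption
```

The open question is the other direction: whether an RSA-decryption oracle would yield a fast factoring algorithm.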
The author's argument is that the probability of this line of research affecting ECC is lower than the probability of it affecting factoring. Post-quantum cryptography has to avoid ECC in any case, since the elliptic curve discrete log problem is easily solved by quantum computers running Shor's algorithm.
Regarding the possible alternatives to ECC: there are no hash-based encryption schemes, only hash-based signatures. I believe there is little faith that any multivariate scheme will remain secure under serious scrutiny; many such schemes have been broken. The outlook on lattice cryptography and some coding-theory-based schemes is slightly more positive. Apparently ECC has received much more attention and study than either of these.
It makes no sense to argue that a non-existent product is vulnerable to a current exploit without a precise definition or specification of that product.
It's an obvious mistake, but not so obvious that the author can't get away with choosing to make it.