Hacker News

> Your question is predicated on a nonexistent distinction. A combination of algorithms is simply a new, more complex algorithm.

That's true if you combine multiple algorithms conventionally, for example by feeding inputs from one to the other or by passing the original entropy to the two algorithms in parallel.

However, if you run one algorithm independently of another - with completely different entropy sources - and flip between supplying the output of each in rapid succession, then on the face of it, for ~50% overhead, you have probably mitigated most exploitable vulnerabilities in either of the two single PRNGs for most applications. (Optionally add another for more security, and take it a step further...)

You are right that this would need to be done carefully. Perhaps the flip in which PRNG is used for the output stream should occur at hard-to-predict points in time, with the range of possible flip periods chosen to take into account other cryptographically relevant factors. Speaking as a complete non-cryptographer, these might include the typical entropy draw size of client applications (optimizing so that a single draw spans multiple PRNGs), the length of any feedback register or calibration cycle of individual cryptographic client algorithms, and so on. Generally this would mean 'pretty frequently, but not so frequently that it's predictably useful' - something of a paradox, true.

In short, I still think this approach may have some merit.
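As a concrete illustration of the alternation scheme described above, here is a minimal Python sketch. All names are hypothetical, and `random.Random` is NOT a cryptographically secure generator - it only stands in here to show the wiring, not to serve as a real CSPRNG:

```python
# Sketch of the proposed construction: two PRNGs seeded from
# independent entropy sources, with the output stream flipping
# between them. random.Random is NOT cryptographically secure;
# it is a stand-in used only to illustrate the structure.
import os
import random

class AlternatingPRNG:
    def __init__(self):
        # Independent seeds, as the proposal requires.
        self.gens = [random.Random(os.urandom(32)),
                     random.Random(os.urandom(32))]
        self.current = 0

    def flip(self):
        # In the proposal the flip happens at hard-to-predict
        # times; here it is an explicit call for simplicity.
        self.current ^= 1

    def next_byte(self):
        return self.gens[self.current].getrandbits(8)

rng = AlternatingPRNG()
out = []
for _ in range(8):
    out.append(rng.next_byte())
    rng.flip()  # one simple policy: flip after every draw
```

A real design would, per the comment above, randomize when the flip occurs rather than alternating deterministically - which is exactly where the difficulty pointed out downthread arises.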



> That's true if you combine multiple algorithms conventionally

What I said is true no matter how you combine them, including in the manner you propose.

> flip between supplying the output of each in rapid succession

You've just exposed yourself to a possible related-key attack. Also, the effective entropy of your key could be halved simply by one of the PRNGs being compromised, so even absent a related-key attack, you'd better be using an encryption algorithm with a big key size.

(BTW, it's 100% overhead, not 50% overhead.)

(Oh, and if both algorithms happen to be compromised, the results are or might as well be 0-entropy. Yet again, somewhere along the way you end up trusting something.)
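To make the halved-entropy point concrete: if the stream simply alternates between two generators and one of them is predictable, the attacker recovers every other output exactly. A minimal Python sketch (names hypothetical; `random.Random` stands in for both the good and the broken generator):

```python
# If the output stream alternates between generator A and generator B,
# and an attacker can predict B (say, B is a broken PRNG whose seed
# leaked), then every other output byte is known to the attacker.
import random

good = random.Random(1234)         # pretend this seed is secret
broken = random.Random(42)         # pretend this seed leaked
attacker_copy = random.Random(42)  # attacker reproduces the broken PRNG

stream = []
for i in range(16):
    gen = good if i % 2 == 0 else broken
    stream.append(gen.getrandbits(8))

# The attacker recovers every odd-indexed byte exactly.
predicted = [attacker_copy.getrandbits(8) for _ in range(8)]
assert predicted == stream[1::2]
```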

> perhaps the flip in which PRNG is used for the output stream occurs at hard-to-predict points in time

"Hard-to-predict" requires a secure random number generator. You've gone on to describe yet another ad-hoc random number generator that you would use in the construction of your random number generator.

Cryptography is hard.


> You've just exposed yourself to a possible related-key attack

It's a tradeoff for mitigating other, perhaps more realistic attack vectors.

> the effective entropy of your key could be halved simply by one of the PRNGs being compromised

Halved is a whole lot better than wholly negated, which was the point of the suggestion.

> Cryptography is hard.

Yeah.

Architecturally, though, it seems that (dis-)trusting two supposedly secure PRNG sources is better than putting all your eggs in one basket with one.


Basically, cryptography requires a PRNG whose output is provably indistinguishable from a true RNG. Your algorithm cannot be proven to have this property, and might even be susceptible to some attacks.

Basically, it's a band-aid, but cryptography is rigorous and averse to band-aids, requiring everything to be mathematically proven before use.

It's not enough for an algorithm to "seem reasonable"; it must be mathematically proven to have certain properties. It's really an extremely interesting subject to study - I'd say give it a go. Join the Coursera class on cryptography.


> Basically, it's a band-aid, but cryptography is rigorous and averse to band-aids, requiring everything to be mathematically proven before use.

To be honest, I am quite partial to the description of modern economics given by George Soros, namely: the entire thing is based on a false analogy with Newtonian physics.

This is the proverbial 'false sense of security', lent credence by its ubiquity in the field. Perhaps mathematicians in general sometimes take a not wholly dissimilar bent in their reasoning; i.e. they think that because something is proven in a mathematical sense, using their current knowledge of viable routes of deduction, it remains 'true' and unassailable.

The possibility of someone else having, or coming up with, a faster physical or logical method of deduction does threaten these assumptions.

The broader goal, then, is not to trust a single PRNG algorithm or, if possible, a single branch of mathematics (I am not skilled in that area, but have heard Pythagorean vs. elliptic-curve mentioned). I am positing that this level of paranoia, whilst somewhat more costly computationally and resource-wise, is probably a good idea, and that the present article is a good example demonstrating the efficacy of this line of thinking.

The secondary route of side-channel attacks is also hampered by this strategy, since arbitrary entropy sources (potential side channels) may be (re)assigned to arbitrary PRNG algorithms running in parallel at any time.

In summary: hedge thy bets.


> they think that because something is proven in a mathematical sense using their current knowledge of viable routes of deduction that it remains 'true' and unassailable.

That's because it does.

> The threat of someone else having or coming up with a faster physical or logical method of deduction does threaten these assumptions.

No, it doesn't, because there are no assumptions.

> The broader goal, then, is not to trust a single PRNG algorithm, or, if possible, branch of mathematics.

No, if you don't trust something, use something you do trust. The change you're proposing has more potential to do harm than good, e.g. render nine perfectly good PRNGs broken just because you mixed them with one broken one. If you had only used one PRNG, your adversary had to break that PRNG, but, now that you used ten, he only has to break the weakest one, so you've made things much easier for him.

> whilst computationally and resource-wise somewhat more costly, probably a good idea.

Nope, it probably isn't.

> In summary: hedge thy bets.

In summary, the entire crypto community is smarter than you (or me), and any improvement you (or I) think might work has probably been proven hopelessly broken a thousand times over. There's a reason people use a single PRNG, and the reason is that, when your PRNG's output is provably indistinguishable from true randomness, messing with it will certainly not improve it, and will probably ruin it.


> There's a reason people use a single PRNG

I just want to know what it is.

(Edit as I can't reply to child: Sure, but this gets back to the point that mathematical analysis does not equal real-world proof, since we do not know all possible computational (or side-channel!) vectors. Assertion otherwise is reminiscent of Socrates: οὖτος μὲν οἴεταί τι εἰδέναι οὐκ εἰδώς, ἐγὼ δέ, ὥσπερ οὖν οὐκ οἶδα, οὐδὲ οἴμαι - "This man, on one hand, believes that he knows something, while not knowing [anything]. On the other hand, I - equally ignorant - do not believe [that I know anything]." https://en.wikipedia.org/wiki/I_know_that_I_know_nothing ... OK, that's weak, but honestly that's how I tend to think (i.e. with great cynicism about any assertion), and it serves me alright in general security / non-cryptographic discourse. Basically this approach treats the PRNG algorithm as a black box, refuses to trust it, and attempts to design a system incorporating that black box along with others claiming the same functionality, based upon that assumption. If this works for any other algorithm - and it seems to - then why not PRNGs? It's not like cryptography has a monopoly as an arbiter of risk. Cryptography is not a silver bullet.)


It's in the next sentence:

> and the reason is that, when your PRNG's output is provably indistinguishable from true randomness, messing with it will certainly not improve it, and will probably ruin it.




