
Ah, but they can repeat! There is a nonzero probability that any given sequence of any finite length will occur somewhere in a random sequence.

So while a compression algorithm applied to a very large amount of random data is unlikely to reduce its size, it is quite possible to achieve some compression on smaller blocks.
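A quick sanity check of the "they can repeat" point, as a minimal Python sketch (os.urandom as the random source and the 2-byte pattern are arbitrary choices for the demo):

    import os

    # Any fixed 2-byte pattern starts at a given offset with probability 2**-16,
    # so in ~1 MB of random bytes it is overwhelmingly likely to appear somewhere.
    pattern = b"\xca\xfe"            # arbitrary pattern for the demo
    stream = os.urandom(1_000_000)   # stand-in random source

    offset = stream.find(pattern)
    print("pattern found at offset", offset)  # -1 ("not found") is possible but very unlikely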



Any given _example_ of a random sequence may have less than maximum entropy, and so may be compressible, but truly random data _in general_ is not compressible. So you can write a compression algorithm that happens to shrink some random inputs, but by a simple counting argument any algorithm that shrinks some inputs must expand others, so it can never compress arbitrary random data. Does this help?
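To make the counting argument concrete, here is a small Python sketch; zlib is only a stand-in for a generic lossless compressor, not anything specific from the thread:

    import os
    import zlib

    # Pigeonhole: there are 2**n distinct n-bit inputs but only 2**n - 1 bit
    # strings shorter than n bits, so no lossless scheme can shrink every input.
    n = 16
    shorter_outputs = sum(2**k for k in range(n))  # = 2**n - 1
    assert shorter_outputs < 2**n

    # Empirical check: a megabyte of random bytes essentially never shrinks.
    data = os.urandom(1_000_000)
    packed = zlib.compress(data, 9)
    print(len(data), len(packed))  # packed is typically slightly larger than the input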


You should take a look at the almost-15-year-old (and still unsolved) "Random Compression Challenge":

http://marknelson.us/2012/10/09/the-random-compression-chall...



