Ah, but they can repeat! There is a non-zero chance that any sequence of any length will occur somewhere in a random sequence.
So while a compression algorithm applied to a very large amount of random data is unlikely to reduce its size, it is quite possible to achieve some compression on smaller blocks.
Any given _example_ of a random sequence may have less than maximum entropy, and so may be compressible, but a truly random sequence _in general_ will not be compressible. So you can write a compression algorithm that happens to compress random data in some cases, but no algorithm can compress every input: by a simple counting (pigeonhole) argument, there are more possible inputs of length n than outputs shorter than n, so some inputs must map to outputs at least as long as themselves. Does this help?
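To see the "less than maximum entropy" point concretely, here is a rough sketch (my own illustration, not from any particular library beyond the Python stdlib) that measures the empirical byte-level entropy of random samples. Small samples tend to measure below the 8 bits/byte maximum purely by chance, because some byte values repeat or are missing, while large samples approach the maximum:

```python
import math
import os
from collections import Counter

def byte_entropy(data):
    """Empirical Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A 64-byte sample has at most 64 distinct byte values, so its empirical
# entropy is capped at log2(64) = 6 bits/byte -- well below the 8-bit max.
# A megabyte of random data will measure very close to 8 bits/byte.
for size in (64, 1024, 1_000_000):
    sample = os.urandom(size)
    print(f"{size:>9} bytes: {byte_entropy(sample):.3f} bits/byte")
```

This is exactly the asymmetry in the answer above: a particular small sample can look (and be) compressible, but as the sample grows, the measured entropy converges to the maximum and the opportunity disappears.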