
> Say you are thinking of a number between 1 and 10,000 and give me one thousand attempts to guess it. To make it harder, you change the number every 100 guesses. To make it harder still, you are thinking of changing it every 50 guesses. Would that work? Well, in the first case I get 100 guesses at 10 different numbers, in the second I get 50 guesses at 20 different numbers, but that makes no difference — I get the same number of guesses, and I only have to guess correctly once to succeed. Mathematically, it boils down to the fact that (x^a)^b = x^(ab).

That's blatantly false. Of course it makes a difference, because changing the number makes the guesses stochastically independent.

If I choose a number between 1 and 10 and give you 10 chances to guess it, you have a 100% chance of guessing it: just guess each number once. If, on the other hand, I change the number before every guess, each guess independently hits with probability 1/10, so your chance of at least one success is only 1 − (9/10)^10 ≈ 65%.
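A quick simulation (my own sketch, not from the thread) bears out both figures: exhaustively guessing a fixed number always succeeds, while guessing against a freshly redrawn number succeeds only about 1 − 0.9¹⁰ ≈ 65% of the time.

```python
import random

N = 10  # range of the secret number, 1..N

def fixed_number_trial():
    # The number stays fixed, so the guesser can try each value once
    # and is guaranteed to hit it.
    secret = random.randint(1, N)
    return any(guess == secret for guess in range(1, N + 1))

def changing_number_trial():
    # A fresh, independent number is drawn before every guess, so each
    # guess hits with probability 1/N regardless of earlier guesses.
    for _ in range(N):
        secret = random.randint(1, N)
        if random.randint(1, N) == secret:
            return True
    return False

trials = 100_000
fixed_rate = sum(fixed_number_trial() for _ in range(trials)) / trials
changing_rate = sum(changing_number_trial() for _ in range(trials)) / trials

print(fixed_rate)     # 1.0
print(changing_rate)  # close to 1 - 0.9**10 ≈ 0.651
```

The same argument scales to the quoted scenario: independence across guessing windows multiplies failure probabilities instead of letting the guesser eliminate candidates.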

Edit: fixed the percentage; it follows from a binomial calculation, and I originally miscalculated it.



