Originally Posted by kewl
But if you follow the original question word for word you'll get the inevitable 1-in-11 ratio, so how can that be?
Let's make it even more childish.
Imagine a pair of dice that, for the sake of a clearer picture, always rolls a perfect distribution of all 36 combinations in every 36 rolls.
So when you roll them 36 times you get all 36 combos one after the other, no repeats. I'm guessing we can all agree that's not how it goes in reality, but it's a simple stand-in for the distribution you'd expect in the long run, since it's derived from the fixed probability of each face on each die.
So you roll those dice 36 times and you get 25 "no deuce" rolls and 11 "at least one deuce" rolls, with exactly one "deuce-deuce" among those 11, of course.
And again, a fresh set of 36 rolls gives the same result: 11 "at least one deuce" rolls and exactly one 2-2.
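The tally above is easy to verify by brute enumeration; here's a short Python sketch (my own, not from the thread) that counts the three cases the thought experiment names:

```python
# Enumerate all 36 ordered outcomes of two dice and count the cases
# described above: "no deuce", "at least one deuce", and "deuce-deuce".
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 ordered pairs

no_deuce = [o for o in outcomes if 2 not in o]
at_least_one_deuce = [o for o in outcomes if 2 in o]
deuce_deuce = [o for o in outcomes if o == (2, 2)]

print(len(no_deuce), len(at_least_one_deuce), len(deuce_deuce))  # 25 11 1
```

Sure enough: 25 + 11 = 36, and exactly one of the 11 "at least one deuce" combos is the 2-2.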
So, if you bet $1 on each of those 11 "at least one deuce" rolls, which the peeker meticulously announces, at odds of 8 to 1 that a pair of twos is under the cup, you'd have lost $2 total: ten losing bets at -$1 each, and one winning 2-2 bet at +$8.
So how can that be?
How can it be that a 1-in-6 probability of "the other die" being a 2 actually yields 1-in-11 results?
Doesn't this tell us that the dice themselves actually agree with the 1/11 bunch?
Doesn't this mean that 1 in 6 is actually the answer to a different question?
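For anyone who'd rather let real (pseudo-random) dice settle it, here's a quick Monte Carlo sketch (my own setup, assuming the peeker announces every roll showing at least one deuce, and a $1 bet at 8 to 1 on each announcement). It also tracks the *different* question: how often the other die is a 2 when one specific, named die shows a 2.

```python
# Monte Carlo: roll two dice many times and compare two conditions.
# Condition A (the peeker's): "at least one die shows a 2" -> expect 2-2 about 1/11.
# Condition B (a different question): "die A shows a 2"     -> expect 2-2 about 1/6.
import random

random.seed(1)
announced = dd_given_announced = 0   # condition A counters
a_is_2 = dd_given_a = 0              # condition B counters
bankroll = 0                         # $1 bets at 8 to 1 on condition A

for _ in range(1_000_000):
    a, b = random.randint(1, 6), random.randint(1, 6)
    if a == 2 or b == 2:             # peeker announces "at least one deuce"
        announced += 1
        if a == 2 and b == 2:
            dd_given_announced += 1
            bankroll += 8            # winning bet pays 8 to 1
        else:
            bankroll -= 1            # losing bet costs the $1 stake
    if a == 2:                       # the *named*-die condition
        a_is_2 += 1
        if b == 2:
            dd_given_a += 1

print(dd_given_announced / announced)  # ≈ 1/11 ≈ 0.091
print(dd_given_a / a_is_2)             # ≈ 1/6  ≈ 0.167
print(bankroll)                        # negative: the 8-to-1 bet loses money
```

The dice land on the 1/11 side of the argument under the peeker's announcement, and on 1/6 only when the question names a specific die — which is exactly the "different question" the post is pointing at.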