I previously stated that the probability of rolling doubles on two six-sided dice is $1/6$. Now I'll show why that's the case.

We can consider rolling one die first and then the other. To roll doubles, the first die can show anything, but the second die must match it. The probability of the second die showing any specific value is $1/6$, so the probability of rolling doubles is $1/6$. We can write this chain of thought as,

\begin{align}
P(\text{doubles on 2d6}) &= \sum_{x=1}^6 P(X_1 = x) \cdot P(X_2 = x) \\
&= \sum_{x=1}^6 P(X_1 = X_2 | X_2 = x) \cdot P(X_2 = x) \\
&= \sum_{x=1}^6 P(X_1 = X_2) \cdot P(X_2 = x) \\
&= P(X_1 = X_2) \cdot \sum_{x=1}^6 P(X_2 = x) \\
&= \frac{1}{6} \sum_{x=1}^6 P(X_2 = x) \\
&= \frac{1}{6} \cdot 6 \cdot \frac{1}{6} \\
&= \frac{1}{6}
\end{align}

where $X_1$ and $X_2$ are random variables representing the results of the first and second die, respectively, and $x$ represents the value being matched. In case you aren't familiar with sigma notation, $\sum_{x=1}^6 f(x)$ is the sum of $f(x)$ over each integer value of $x$ starting at 1 and ending at 6 (note that the summation limits are sometimes typeset in slightly different positions relative to the capital Greek letter $\Sigma$). Here $P(X_1 = X_2 | X_2 = x)$ is the probability that the two dice are equal conditioned on $X_2 = x$. Since this probability is independent of $x$ (for $x \in \{1, 2, 3, 4, 5, 6\}$), it is a constant $1/6$.
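If you'd rather check the claim by brute force than by algebra, here is a short Python sketch (the variable names are mine, not standard) that enumerates all 36 equally likely outcomes with exact fractions and also confirms that the conditional probability is the same constant for every value of $x$:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))

# P(doubles): fraction of outcomes where the two dice match.
p_doubles = Fraction(sum(x1 == x2 for x1, x2 in outcomes), len(outcomes))
print(p_doubles)  # 1/6

# P(X1 = X2 | X2 = x) is the same constant 1/6 for every x.
for x in range(1, 7):
    matching = [(x1, x2) for x1, x2 in outcomes if x2 == x]
    p_cond = Fraction(sum(x1 == x2 for x1, x2 in matching), len(matching))
    assert p_cond == Fraction(1, 6)
```

Using `Fraction` rather than floats keeps the arithmetic exact, so the result is literally $1/6$ rather than a rounded decimal.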

A second derivation, likely easier to grasp, follows.

\begin{align}
P(\text{doubles on 2d6}) &= \sum_{x=1}^6 P(X_1 = x) \cdot P(X_2 = x) \\
&= \sum_{x=1}^6 \frac{1}{6} \cdot \frac{1}{6} \\
&= 6 \cdot \frac{1}{6} \cdot \frac{1}{6} \\
&= \frac{1}{6}
\end{align}

The probability of getting doubles is the sum of the probabilities of all specific cases of getting doubles. Each specific case (e.g. rolling two ones) occurs with probability $\frac{1}{6}\cdot\frac{1}{6}$, since both dice must have that value. There are a total of six cases, one for each possible value.

This generalizes simply to other types of dice. Namely, for two $N$-sided dice (here I'm assuming the dice are numbered 1 through $N$), the probability of rolling doubles is $\frac{1}{N}$. I'll leave showing this as an exercise for the reader (hint: replace every "6" above with "$N$").
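You can verify the generalization numerically, if not algebraically, with a small Python helper (my own sketch, not a standard function) that enumerates the outcomes for any die size:

```python
from fractions import Fraction
from itertools import product

def doubles_prob(n):
    """Exact probability of doubles on two fair n-sided dice, by enumeration."""
    outcomes = list(product(range(1, n + 1), repeat=2))
    return Fraction(sum(a == b for a, b in outcomes), len(outcomes))

# The enumerated probability is 1/N for every die size tried.
for n in (4, 6, 8, 10, 12, 20):
    assert doubles_prob(n) == Fraction(1, n)
```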

But what about two dice of different types? Suppose we have an $M$-sided die and an $N$-sided die, numbered 1 through $M$ and 1 through $N$, respectively, and let $X_M$ and $X_N$ be their results. As before, the probability of doubles is the sum over each possible value $x$ of the probability that both dice show $x$.

\begin{align}
P(\text{doubles on }1\text{d}M+1\text{d}N) &= P(X_M = X_N) \\
&= \sum_{x} P(X_M = x)\cdot P(X_N = x)
\end{align}

Since the probabilities $P(X_M = x)$ and $P(X_N = x)$ are nonzero only for $1 \leq x \leq M$ and $1 \leq x \leq N$, respectively, we only need to sum over the range $1 \leq x \leq \min(M, N)$. Without loss of generality, we may assume that $M < N$ (the dice are of different types, so $M \neq N$). Thus,

\begin{align}
P(\text{doubles on }1\text{d}M+1\text{d}N) &= \sum_{x=1}^M P(X_M = x)\cdot P(X_N = x) .
\end{align}

Over this range, the probability of any given value is $1/M$ on the first die and $1/N$ on the second.

\begin{align}
P(\text{doubles on }1\text{d}M+1\text{d}N) &= \sum_{x=1}^M \frac{1}{M} \cdot \frac{1}{N} \\
&= M \cdot \frac{1}{M} \cdot \frac{1}{N} \\
&= \frac{1}{N}
\end{align}

This matches our earlier explanation: no matter what we roll on the die with fewer sides, the probability that the die with more sides matches it is 1 divided by its number of sides.
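As a final check, the mixed-dice case can also be verified by enumeration. The helper below (again, a hypothetical sketch of my own) confirms that the answer depends only on the larger die:

```python
from fractions import Fraction
from itertools import product

def mixed_doubles_prob(m, n):
    """Exact probability that a fair m-sided and a fair n-sided die match."""
    outcomes = list(product(range(1, m + 1), range(1, n + 1)))
    return Fraction(sum(a == b for a, b in outcomes), len(outcomes))

# The result is 1/max(m, n), regardless of the smaller die.
assert mixed_doubles_prob(6, 20) == Fraction(1, 20)
assert mixed_doubles_prob(4, 12) == Fraction(1, 12)
```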

We could instead envision rolling the die with more faces first, but the computation would be more involved, since any value not available on the die with fewer faces can never be matched: the probability of doubles becomes $\frac{M}{N} \cdot \frac{1}{M} + \frac{N - M}{N} \cdot 0 = \frac{1}{N}$. The final result must, of course, match.
