19.1: Distribution of Results for Multiple Trials with Two Possible Outcomes


Suppose that we have two coins, one minted in 2001 and one minted in 2002. Let the probabilities of getting a head and a tail in a toss of the 2001 coin be \(P_{H,1}\) and \(P_{T,1}\), respectively. We assume that these two outcomes exhaust the possibilities, so the laws of probability give \(1=\left(P_{H,1}+P_{T,1}\right)\). For the 2002 coin, we have \(1=\left(P_{H,2}+P_{T,2}\right)\). The product of these two sums must also be unity. Expanding this product gives

    \[ \begin{align*} 1 &=\left(P_{H,1}+P_{T,1}\right)\left(P_{H,2}+P_{T,2}\right) \\[4pt] &=P_{H,1}P_{H,2}+P_{H,1}P_{T,2}+P_{T,1}P_{H,2}+P_{T,1}P_{T,2} \end{align*} \nonumber \]

    This equation represents the probability of a trial in which we toss the 2001 coin first and the 2002 coin second. The individual terms are the probabilities of the possible outcomes of such a trial. It is convenient to give a name to this latter representation of the product; we will call it the expanded representation of the total probability sum.

Our procedure for multiplying two binomials generates a sum of four terms. Each term contains two factors. The first factor comes from the first binomial; the second factor comes from the second binomial. Each of the four terms corresponds to a combination of an outcome from tossing the 2001 coin and an outcome from tossing the 2002 coin. Conversely, every possible combination of outcomes from tossing the two coins is represented in the sum. \(P_{H,1}P_{H,2}\) represents the probability of getting a head from tossing the 2001 coin and a head from tossing the 2002 coin. \(P_{H,1}P_{T,2}\) represents the probability of getting a head from tossing the 2001 coin and a tail from tossing the 2002 coin, and so on. In short, there is a one-to-one correspondence between the terms in this sum and the possible combinations of the outcomes of tossing these two coins.
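These correspondences are easy to verify by brute-force enumeration. A minimal sketch in Python (the biased probabilities 0.6/0.4 and 0.7/0.3 are arbitrary assumed values) expands the product term by term:

```python
from itertools import product

P1 = {"H": 0.6, "T": 0.4}   # assumed probabilities for the 2001 coin
P2 = {"H": 0.7, "T": 0.3}   # assumed probabilities for the 2002 coin

# One term per combination of outcomes, as in the expanded representation.
terms = {x + y: P1[x] * P2[y] for x, y in product("HT", repeat=2)}
print(terms)                 # four terms: HH, HT, TH, TT
print(sum(terms.values()))   # 1.0 (within floating-point rounding)
```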

    This analysis depends on our ability to tell the two coins apart. For this, the mint date is sufficient. If we toss the two coins simultaneously, the four possible outcomes remain the same. Moreover, if we distinguish the result of a first toss from the result of a second toss, etc., we can generate the same outcomes by using a single coin. If we use a single coin, we can represent the possible outcomes from two tosses by the ordered sequences \(HH\), \(HT\), \(TH\), and \(TT\), where the first symbol in each sequence is the result of the first toss and the second symbol is the result of the second toss. The ordered sequences \(HT\) and \(TH\) differ only in the order in which the symbols appear. We call such ordered sequences permutations.

Now let us consider a new problem. Suppose that we have two coin-like slugs that we can tell apart because we have scratched a “\(1\)” onto the surface of one and a “\(2\)” onto the surface of the other. Suppose that we also have two cups, one marked “\(H\)” and the other marked “\(T\).” We want to figure out how many different ways we can put the two slugs into the two cups. We can also describe this as the problem of finding the number of ways we can assign two distinguishable slugs (objects) to two different cups (categories). There are four such ways: Cup \(H\) contains slugs \(1\) and \(2\); Cup \(H\) contains slug \(1\) and Cup \(T\) contains slug \(2\); Cup \(H\) contains slug \(2\) and Cup \(T\) contains slug \(1\); Cup \(T\) contains slugs \(1\) and \(2\).

    We note that, given all of the ordered sequences for tossing two coins, we can immediately generate all of the ways that two distinguishable objects (numbered slugs) can be assigned to two categories (Cups \(H\) and \(T\)). For each ordered sequence, we assign the first object to the category corresponding to the first symbol in the sequence, and we assign the second object to the category corresponding to the second symbol in the sequence.

    In short, there are one-to-one correspondences between the sequences of probability factors in the total probability sum, the possible outcomes from tossing two distinguishable coins, the possible sequences of outcomes from two tosses of a single coin, and the number of ways we can assign two distinguishable objects to two categories. (See Table 1.)

Table 1.

| Problems | Correspondences | | | |
|---|---|---|---|---|
| Sequences of probability factors in the total probability sum | \(P_{H,1}P_{H,2}\) | \(P_{H,1}P_{T,2}\) | \(P_{T,1}P_{H,2}\) | \(P_{T,1}P_{T,2}\) |
| Probability factors for coins distinguished by identification numbers | \(P_HP_H\) | \(P_HP_T\) | \(P_TP_H\) | \(P_TP_T\) |
| Sequences from tosses of a single coin | \(HH\) | \(HT\) | \(TH\) | \(TT\) |
| Assignments of two distinguishable objects to two categories | Cup \(H\) holds slugs 1 & 2 | Cup \(H\) holds slug 1 and Cup \(T\) holds slug 2 | Cup \(H\) holds slug 2 and Cup \(T\) holds slug 1 | Cup \(T\) holds slugs 1 & 2 |

    If the probability of tossing a head is constant, we have \(P_{H,1}=P_{H,2}=P_H\) and \(P_{T,1}=P_{T,2}=P_T\). Note that we are not assuming \(P_H=P_T\). If we do not care about the order in which the heads and tails appear, we can simplify our equation for the product of probabilities to

    \[1=P^2_H+2P_HP_T+P^2_T \nonumber \]

    \(P^2_H\) is the probability of tossing two heads, \(P_HP_T\) is the probability of tossing one head and one tail, and \(P^2_T\) is the probability of tossing two tails. We must multiply the \(P_HP_T\)-term by two, because there are two two-coin outcomes and correspondingly two combinations, \(P_{H,1}P_{T,2}\) and \(P_{T,1}P_{H,2}\), that have the same probability, \(P_HP_T\). Completely equivalently, we can say that the reason for multiplying the \(P_HP_T\)-term by two is that there are two permutations, \(HT\) and \(TH\), which correspond to one head and one tail in successive tosses of a single coin.
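This compressed two-toss sum is easy to check numerically. A minimal sketch, assuming an arbitrary biased coin with \(P_H=0.6\):

```python
P_H, P_T = 0.6, 0.4   # assumed values; any pair with P_H + P_T = 1 works

# Two heads, one head and one tail (x2 for the two permutations), two tails.
print(P_H**2 + 2 * P_H * P_T + P_T**2)   # 1.0 (within floating-point rounding)
```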

    We have lavished considerable attention on four related but very simple problems. Now, we want to extend this analysis—first to tosses of multiple coins and then to situations in which multiple outcomes are possible for each of many independent events. Eventually we will find that understanding these problems enables us to build a model for the behavior of molecules that explains the observations of classical thermodynamics.

    If we extend our analysis to tossing \(n\) coins, which we label coins \(1\), \(2\), etc., we find:

\[ \begin{align*} 1 &=\left(P_{H,1}+P_{T,1}\right)\left(P_{H,2}+P_{T,2}\right)\dots \left(P_{H,n}+P_{T,n}\right) \\[4pt] &=\left(P_{H,1}P_{H,2}\dots P_{H,n}\right)+\left(P_{H,1}P_{H,2}\dots P_{H,i}\dots P_{T,n}\right)+\dots +\left(P_{T,1}P_{T,2}\dots P_{T,i}\dots P_{T,n}\right) \end{align*} \nonumber \]

    We write each of the product terms in this expanded representation of the total-probability sum with the second index, \(r\), increasing from \(1\) to \(n\) as we read through the factors, \(P_{X,r}\), from left to right. Just as for tossing only two coins:

1. Each product term is a sequence of probability factors that appears in the total probability sum.
2. Each product term corresponds to a possible outcome from tossing \(n\) coins that are distinguished from one another by identification numbers.
3. Each product term is equivalent to a possible outcome from repeated tosses of a single coin: the \(r^{th}\) factor is \(P_H\) or \(P_T\) according as the \(r^{th}\) toss produces a head or a tail.
4. Each product term is equivalent to a possible assignment of \(n\) distinguishable objects to the two categories \(H\) and \(T\).
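These correspondences can be checked mechanically for small \(n\). A minimal sketch (the per-toss probabilities and \(n=4\) are assumed for illustration) builds every term of the expanded representation:

```python
import math
from itertools import product

P = {"H": 0.6, "T": 0.4}   # assumed constant per-toss probabilities
n = 4

# One product term for each of the 2**n ordered outcome sequences.
terms = {"".join(seq): math.prod(P[x] for x in seq)
         for seq in product("HT", repeat=n)}
print(len(terms))            # 16 = 2**4 terms in the expanded sum
print(sum(terms.values()))   # 1.0 (within floating-point rounding)
```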

In Section 3.9, we introduced the term population set to denote a set of numbers that represents a possible combination of outcomes. Here the possible combinations of outcomes are the numbers of heads and tails. If in five tosses we obtain \(3\) heads and \(2\) tails, we say that this group of outcomes belongs to the population set \(\{3,2\}\). If in \(n\) tosses, we obtain \(n_H\) heads and \(n_T\) tails, this group of outcomes belongs to the population set \(\{n_H,n_T\}\). For five tosses, the possible population sets are \(\left\{5,0\right\}\), \(\left\{4,1\right\}\), \(\left\{3,2\right\}\), \(\left\{2,3\right\}\), \(\left\{1,4\right\}\), and \(\left\{0,5\right\}\). Beginning in the next chapter, we focus on the energy levels that are available to a set of particles and on the number of particles that has each of the available energies. Then the number of particles, \(N_i\), that have energy \({\epsilon }_i\) is the population of the \({\epsilon }_i\)-energy level. The set of all such numbers is the energy-level population set for the set of particles.
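Grouping the \(2^n\) ordered sequences by their population sets reproduces these counts; a short sketch for five tosses:

```python
from collections import Counter
from itertools import product

n = 5
# Tally each ordered H/T sequence under its population set {n_H, n_T}.
pop_sets = Counter((t.count("H"), t.count("T")) for t in product("HT", repeat=n))
print(dict(pop_sets))
# {(5, 0): 1, (4, 1): 5, (3, 2): 10, (2, 3): 10, (1, 4): 5, (0, 5): 1}
```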

If we cannot distinguish one coin from another, the sequence \(P_{H,1}P_{T,2}P_{H,3}P_{H,4}\) becomes \(P_HP_TP_HP_H\). We say that \(P_HP_TP_HP_H\) is distinguishable from \(P_HP_HP_TP_H\) because the tails-outcome appears in the second position in \(P_HP_TP_HP_H\) and in the third position in \(P_HP_HP_TP_H\). We say that \(P_{H,1}P_{T,2}P_{H,3}P_{H,4}\) and \(P_{H,3}P_{T,2}P_{H,1}P_{H,4}\) are indistinguishable, because both become \(P_HP_TP_HP_H\). In general, many terms in the expanded form of the total probability sum belong to the population set corresponding to \(n_H\) heads and \(n_T\) tails. Each such term corresponds to a distinguishable permutation of \(n_H\) heads and \(n_T\) tails and the corresponding distinguishable permutation of \(P_H\) and \(P_T\) terms.

We use the notation \(C\left(n_H,n_T\right)\) to denote the number of terms in the expanded form of the total probability sum in which there are \(n_H\) heads and \(n_T\) tails. \(C\left(n_H,n_T\right)\) is also the number of distinguishable permutations of \(n_H\) heads and \(n_T\) tails, or of \(n_H\) \(P_H\)-terms and \(n_T\) \(P_T\)-terms. The principal goal of our analysis is to find a general formula for \(C\left(n_H,n_T\right)\). To do so, we make use of the fact that \(C\left(n_H,n_T\right)\) is also the number of ways that we can assign \(n\) objects (coins) to two categories (heads or tails) in such a way that \(n_H\) objects are in one category (heads) and \(n_T\) objects are in the other category (tails). We also call \(C\left(n_H,n_T\right)\) the number of combinations possible for distinguishable coins in the population set \(\{n_H,n_T\}\).

    The importance of \(C\left(n_H,n_T\right)\) is evident when we recognize that, if we do not care about the sequence (permutation) in which a particular number of heads and tails occurs, we can represent the total-probability sum in a much compressed form:

    \[1=P^n_H+nP^{n-1}_HP_T+\dots +C\left(n_H,n_T\right)P^{n_H}_HP^{n_T}_T+nP_HP^{n-1}_T+P^n_T \nonumber \]

    In this representation, there are \(n\) terms in the total-probability sum that have \(n_H=n-1\) and \(n_T=1\). These are the terms

\[\begin{align*} &P_{H,1}P_{H,2}P_{H,3}\dots P_{H,i}\dots P_{H,n-1}\boldsymbol{P_{T,n}} \\ &P_{H,1}P_{H,2}P_{H,3}\dots P_{H,i}\dots \boldsymbol{P_{T,n-1}}\,P_{H,n} \\ &\qquad \vdots \\ &P_{H,1}P_{H,2}P_{H,3}\dots \boldsymbol{P_{T,i}}\dots P_{H,n-1}P_{H,n} \\ &\qquad \vdots \\ &P_{H,1}P_{H,2}\boldsymbol{P_{T,3}}\dots P_{H,i}\dots P_{H,n-1}P_{H,n} \\ &P_{H,1}\boldsymbol{P_{T,2}}\,P_{H,3}\dots P_{H,i}\dots P_{H,n-1}P_{H,n} \\ &\boldsymbol{P_{T,1}}\,P_{H,2}P_{H,3}\dots P_{H,i}\dots P_{H,n-1}P_{H,n} \end{align*} \nonumber \]

Each of these terms represents the probability that \(n-1\) heads and one tail will occur in the order shown. Each of these terms has the same value. Each of these terms is a distinguishable permutation of \(n-1\) \(P_H\) terms and one \(P_T\) term. Each of these terms corresponds to a combination in which one of \(n\) numbered slugs is assigned to Cup \(T\), while the remaining \(n-1\) numbered slugs are assigned to Cup \(H\). It is easy to see that there are \(n\) such terms, because each term is the product of \(n\) probabilities, and the tail can occur at any of the \(n\) positions in the product. If we do not care about the order in which heads and tails occur and are interested only in the value of the sum of these \(n\) terms, we can replace these \(n\) terms by the one term \(nP^{n-1}_HP_T\). We see that \(nP^{n-1}_HP_T\) is the probability of tossing \(n-1\) heads and one tail, irrespective of which toss produces the tail.
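Direct enumeration confirms that exactly \(n\) of the \(2^n\) sequences contain a single tail; a minimal sketch with an assumed \(n=6\):

```python
from itertools import product

n = 6
seqs = ["".join(t) for t in product("HT", repeat=n)]
one_tail = [s for s in seqs if s.count("T") == 1]
print(len(one_tail))   # 6: the tail can occupy any of the n positions
print(one_tail)        # ['HHHHHT', 'HHHHTH', ..., 'THHHHH']
```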

There is another way to show that there must be \(n\) terms in the total-probability sum in which there are \(n-1\) heads and one tail. This method relies on the fact that the number of such terms is the same as the number of combinations in which \(n\) distinguishable things are assigned to two categories, with \(n-1\) of the things in one category and the remaining thing in the other category, \(C\left(n-1,1\right)\). This method is a little more complicated, but it offers the great advantage that it can be generalized.

The new method requires that we think about all of the permutations we can create by reordering the results from any particular series of \(n\) tosses. To see what we have in mind when we say all of the permutations, let \(P_{X,k}\) represent the probability of toss number \(k\), where for the moment we do not care whether the outcome was a head or a tail. When we say all of the permutations, we mean the number of different ways we can order (permute) the \(n\) different values \(P_{X,k}\). It is important to recognize that one and only one of these permutations is a term in the total-probability sum, specifically:

    \[P_{X,1}P_{X,2}P_{X,3}\dots P_{X,k}\dots P_{X,n} \nonumber \]

in which the values of the second subscript are in numerical order. When we set out to construct all of these permutations, we see that there are \(n\) ways to choose the toss to put first and \(n-1\) ways to choose the toss to put second, so there are \(n\left(n-1\right)\) ways to choose the first two tosses. There are \(n-2\) ways to choose the third toss, so there are \(n\left(n-1\right)\left(n-2\right)\) ways to choose the first three tosses. Continuing in this way through all \(n\) tosses, we see that the total number of ways to order the results of \(n\) tosses is \(n\left(n-1\right)\left(n-2\right)\left(n-3\right)\dots \left(3\right)\left(2\right)\left(1\right)=n!\).
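The factorial count can be confirmed by brute force for small \(n\); a sketch with an assumed \(n=5\):

```python
import math
from itertools import permutations

n = 5
tosses = [f"toss{k}" for k in range(1, n + 1)]   # n distinguishable toss results
n_orderings = sum(1 for _ in permutations(tosses))
print(n_orderings, math.factorial(n))            # 120 120
```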

Next, we need to think about the number of ways we can permute \(n\) values \(P_{X,k}\) if \(n-1\) of them are \(P_{H,1},\ P_{H,2},\ \dots,\ P_{H,r-1},\ P_{H,r+1},\ \dots,\ P_{H,n}\) and one of them is \(P_{T,r}\), and we always keep the one factor \(P_{T,r}\) in the same position. By the argument above, there are \(\left(n-1\right)!\) ways to permute the values \(P_{H,s}\) in a set containing \(n-1\) members. So for every term (product of factors \(P_{X,k}\)) that occurs in the total-probability sum, there are \(\left(n-1\right)!\) products (permutations of the same factors, counting the term itself) that differ from one another only in the order in which the \(P_{H,s}\) appear. The single tail outcome occupies the same position in each of these permutations. If the \(r^{th}\) factor in the term in the total probability sum is \(P_{T,r}\), then \(P_{T,r}\) is the \(r^{th}\) factor in each of the \(\left(n-1\right)!\) permutations of this term. This is an important point; let us repeat it in slightly different words: for every term that occurs in the total-probability sum, there are \(\left(n-1\right)!\) permutations of the same factors that leave the heads positions occupied by heads and the tails position occupied by tails.

Equivalently, for every assignment of \(n\) distinguishable objects in which \(n-1\) of them are in one category and the remaining object is in the other, there are \(\left(n-1\right)!\) permutations of these objects. There are \(C\left(n-1,1\right)\) such assignments. Accordingly, there are a total of \(\left(n-1\right)!C\left(n-1,1\right)\) permutations of the \(n\) distinguishable objects. Since we also know that the total number of permutations of \(n\) distinguishable objects is \(n!\), we have

    \[n!=\left(n-1\right)!C\left(n-1,1\right) \nonumber \]

so that \[C\left(n-1,1\right)=\frac{n!}{\left(n-1\right)!}=n \nonumber \]

    which is the same result that we obtained by our first and more obvious method.
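Both counting routes are easy to compare numerically; a one-line check with an assumed \(n=6\):

```python
import math

n = 6
print(math.factorial(n) // math.factorial(n - 1))   # n!/(n-1)! = 6
print(math.comb(n, 1))                              # C(n-1,1), i.e. "n choose 1" = 6
```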

    The distinguishable objects within a category in a particular assignment can be permuted. We give these within-category permutations another name; we call them indistinguishable permutations. (This terminology reflects our intended application, which is to find the number of ways \(n\) identical molecules can be assigned to a set of energy levels. We can tell two isolated molecules of the same substance apart only if they have different energies. We can distinguish molecules in different energy levels from one another. We cannot distinguish two molecules in the same energy level from one another. Two different permutations of the molecules within any one energy level are indistinguishable from one another.) For every term in the expanded representation of the total probability sum, indistinguishable permutations can be obtained by exchanging \(P_H\) factors with one another, or by exchanging \(P_T\) factors with one another, but not by exchanging \(P_H\) factors with \(P_T\) factors. That is, heads are exchanged with heads; tails are exchanged with tails; but heads are not exchanged with tails.

    Now we can consider the general case. We let \(C\left(n_H,n_T\right)\) be the number of terms in the total-probability sum in which there are \(n_H\) heads and \(n_T\) tails. We want to find the value of \(C\left(n_H,n_T\right)\). Let’s suppose that one of the terms with \(n_H\) heads and \(n_T\) tails is

    \[\left(P_{H,a}P_{H,b}\dots P_{H,m}\right)\left(P_{T,r}P_{T,s}\dots P_{T,z}\right) \nonumber \]

    where there are \(n_H\) indices in the set \(\{a,\ b,\ \dots ,m\}\) and \(n_T\) indices in the set \(\{r,s,\dots ,z\}\). There are \(n_H!\) ways to order the heads outcomes and \(n_T!\) ways to order the tails outcomes. So, there are \(n_H!n_T!\) possible ways to order \(n_H\) heads and \(n_T\) tails outcomes. This is true for any sequence in which there are \(n_H\) heads and \(n_T\) tails; there will always be \(n_H!n_T!\) permutations of \(n_H\) heads and \(n_T\) tails, whatever the order in which the heads and tails appear. This is also true for every term in the total-probability sum that contains \(n_H\) heads factors and \(n_T\) tails factors. The number of such terms is \(C\left(n_H,n_T\right)\). For every such term, there are \(n_H!n_T!\) permutations of the same factors that leave the heads positions occupied by heads and the tails positions occupied by tails.

Accordingly, there are a total of \(n_H!n_T!C\left(n_H,n_T\right)\) permutations of the \(n\) distinguishable objects. The total number of permutations of \(n\) distinguishable objects is \(n!\), so that

    \[n!=n_H!n_T!C\left(n_H,n_T\right) \nonumber \]

    and

    \[C\left(n_H,n_T\right)=\frac{n!}{n_H!n_T!} \nonumber \]
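This formula can be verified against a direct count of terms in the expanded sum; a minimal sketch with assumed values \(n=7\) and \(n_H=4\):

```python
import math
from itertools import product

n, n_H = 7, 4
n_T = n - n_H

# Count expanded-sum terms with exactly n_H heads, then compare with n!/(n_H! n_T!).
count = sum(1 for t in product("HT", repeat=n) if t.count("H") == n_H)
formula = math.factorial(n) // (math.factorial(n_H) * math.factorial(n_T))
print(count, formula)   # 35 35
```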

    Equivalently, we can construct a sum of terms, \(R\), in which the terms are all of the \(n!\) permutations of \(P_{H,i}\) factors for \(n_H\) heads and \(P_{T,j}\) factors for \(n_T\) tails. The value of each term in \(R\) is \(P^{n_H}_HP^{n_T}_T\). So we have

    \[R=n!P^{n_H}_HP^{n_T}_T \nonumber \]

    \(R\) contains all \(C\left(n_H,n_T\right)\) of the \(P^{n_H}_HP^{n_T}_T\)-valued terms that appear in the total-probability sum. For each of these \(P^{n_H}_HP^{n_T}_T\)-valued terms there are \(n_H!n_T!\) indistinguishable permutations that leave heads positions occupied by heads and tails positions occupied by tails. \(R\) will also contain all of the \(n_H!n_T!\) permutations of each of these \(P^{n_H}_HP^{n_T}_T\)-valued terms. That is, every term in \(R\) is either a term in the expanded representation of the total probability sum or an indistinguishable permutation of such a term. It follows that \(R\) is also given by

    \[R=n_H!n_T!C\left(n_H,n_T\right)P^{n_H}_HP^{n_T}_T \nonumber \]

Equating these two expressions for \(R\), we have

    \[n!P^{n_H}_HP^{n_T}_T=n_H!n_T!C\left(n_H,n_T\right)P^{n_H}_HP^{n_T}_T \nonumber \]

    and, again,

    \[C\left(n_H,n_T\right)=\frac{n!}{n_H!n_T!} \nonumber \]

In summary: The total number of permutations is \(n!\). The number of combinations of \(n\) distinguishable things in which \(n_H\) of them are assigned to category \(H\) and \(n_T=n-n_H\) are assigned to category \(T\) is \(C\left(n_H,n_T\right)\). (Every combination is a distinguishable permutation.) The number of indistinguishable permutations of the objects in each such combination is \(n_H!n_T!\). The relationship among these quantities is

total number of permutations = (number of distinguishable combinations) × (number of indistinguishable permutations for each distinguishable combination)
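In symbols, the relationship reads \(n!=C\left(n_H,n_T\right)\,n_H!\,n_T!\); a minimal numerical check with assumed values:

```python
import math

n, n_H = 7, 4
n_T = n - n_H
C = math.comb(n, n_H)   # number of distinguishable combinations
# total permutations == combinations * within-category (indistinguishable) permutations
print(math.factorial(n) == C * math.factorial(n_H) * math.factorial(n_T))   # True
```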

We noted earlier that \(C\left(n_H,n_T\right)={n!}/{\left(n_H!n_T!\right)}\) is the formula for the binomial coefficients. If we do not care about the order in which the heads and tails arise, the probability of tossing \(n_T\) tails and \(n_H=n-n_T\) heads is

    \[C\left(n_H,n_T\right)P^{n_H}_HP^{n_T}_T=\left(\frac{n!}{n_H!n_T!}\right)P^{n_H}_HP^{n_T}_T \nonumber \]

    and the sum of such terms for all \(n+1\) possible values of \(n_T\) in the interval \(0\le n_T\le n\) is the total probability for all possible outcomes from \(n\) tosses of a coin. This total probability must be unity. That is, we have

    \[1={\left(P_H+P_T\right)}^n=\sum^n_{n_T=0}{C\left(n_H,n_T\right)P^{n_H}_HP^{n_T}_T}=\sum^n_{n_T=0}{\left(\frac{n!}{n_H!n_T!}\right)P^{n_H}_HP^{n_T}_T} \nonumber \]
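Numerically, the binomial sum returns unity for any assumed \(P_H\); a brief sketch:

```python
import math

n, P_H = 10, 0.6   # assumed values; P_H need not equal P_T
P_T = 1 - P_H
total = sum(math.comb(n, n_T) * P_H**(n - n_T) * P_T**n_T
            for n_T in range(n + 1))
print(total)       # 1.0 (within floating-point rounding)
```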

For an unbiased coin, \(P_H=P_T={1}/{2}\), and \(P^{n_H}_HP^{n_T}_T={\left({1}/{2}\right)}^n\) for all \(n_T\). This means that the probability of tossing \(n_H\) heads and \(n_T\) tails is proportional to \(C\left(n_H,n_T\right)\), where the proportionality constant is \({\left({1}/{2}\right)}^n\). The probability of \(n^*\) heads and \(n-n^*\) tails is the same as the probability of \(n-n^*\) heads and \(n^*\) tails.

Nothing in our development of the equation for the total probability requires that we set \(P_H=P_T\). In fact, the binomial probability relationship applies to any situation in which there are repeated trials, each trial has two possible outcomes, and the probability of each outcome is constant. If \(P_H\neq P_T\), the symmetry observed for an unbiased coin no longer applies, because

\[P^{n-n^*}_HP^{n^*}_T\neq P^{n^*}_HP^{n-n^*}_T \nonumber \]

    This condition corresponds to a biased coin.

    Another example is provided by a spinner mounted at the center of a circle painted on a horizontal surface. Suppose that a pie-shaped section accounting for \(25\%\) of the circle’s area is painted white and the rest is painted black. If the spinner’s stopping point is unbiased, it will stop in the white zone with probability \(P_W=0.25\) and in the black zone with probability \(P_B=0.75\). After \(n\) spins, the probability of \(n_W\) white outcomes and \(n_B\) black outcomes is

    \[\left(\frac{n!}{n_W!n_B!}\right){\left(0.25\right)}^{n_W}{\left(0.75\right)}^{n_B} \nonumber \]
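This probability is straightforward to compute; a minimal sketch (the choice of 2 white outcomes in 10 spins is an assumed illustration):

```python
import math

def spinner_prob(n_W, n_B, p_W=0.25, p_B=0.75):
    """Probability of n_W white and n_B black outcomes in n_W + n_B spins."""
    return math.comb(n_W + n_B, n_W) * p_W**n_W * p_B**n_B

print(spinner_prob(2, 8))   # about 0.282 for 2 white and 8 black in 10 spins
```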

    After \(n\) spins, the sum of the probabilities for all possible combinations of white and black outcomes is

\[\begin{align*} 1 &={\left(P_W+P_B\right)}^n=\sum^n_{n_B=0}{C\left(n_W,n_B\right)P^{n_W}_WP^{n_B}_B} \\[4pt] &=\sum^n_{n_B=0}{\left(\frac{n!}{n_W!n_B!}\right)P^{n_W}_WP^{n_B}_B} \\[4pt] &=\sum^n_{n_B=0}{\left(\frac{n!}{n_W!n_B!}\right){\left(0.25\right)}^{n_W}{\left(0.75\right)}^{n_B}} \end{align*} \nonumber \]


    This page titled 19.1: Distribution of Results for Multiple Trials with Two Possible Outcomes is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Paul Ellgen via source content that was edited to the style and standards of the LibreTexts platform.