
3.4: Applying the Laws of Probability


    The laws of probability apply to events that are independent. If the result of one trial depends on the result of another trial, we may still be able to use the laws of probability. However, to do so, we must know the nature of the interdependence.

    If the activity associated with event C precedes the activity associated with event D, the probability of D may depend on whether C occurs. Suppose that the first activity is tossing a coin and that the second activity is drawing a card from a deck; however, the deck we use depends on whether the coin comes up heads or tails. If the coin is heads, we draw a card from an ordinary deck; if the coin is tails, we draw a card from a deck with the face cards removed. Now we ask about the probability of drawing an ace. If the coin is heads, the probability of drawing an ace is \({4}/{52}={1}/{13}\). If the coin is tails, the probability of drawing an ace is \({4}/{40}={1}/{10}\). The combination coin is heads and card is ace has probability \(\left({1}/{2}\right)\left({1}/{13}\right)={1}/{26}\). The combination coin is tails and card is ace has probability \(\left({1}/{2}\right)\left({1}/{10}\right)={1}/{20}\). In this case, the probability of drawing an ace depends on the modification we make to the deck based on the outcome of the coin toss.
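    This two-stage calculation can be sketched with exact fractions; the following is a minimal illustration of the text's numbers (the variable names are our own):

    ```python
    from fractions import Fraction

    # Coin toss decides which deck we draw from (see text above).
    p_heads = Fraction(1, 2)
    p_tails = Fraction(1, 2)

    # Heads: full 52-card deck; tails: 40-card deck with face cards removed.
    # Each deck still contains all 4 aces.
    p_ace_given_heads = Fraction(4, 52)   # = 1/13
    p_ace_given_tails = Fraction(4, 40)   # = 1/10

    # Probabilities of the two compound events described in the text:
    p_heads_and_ace = p_heads * p_ace_given_heads   # = 1/26
    p_tails_and_ace = p_tails * p_ace_given_tails   # = 1/20

    # Total probability of drawing an ace in this two-stage experiment:
    p_ace = p_heads_and_ace + p_tails_and_ace
    print(p_ace)
    ```

    Summing the two mutually exclusive compound events gives the overall probability of drawing an ace, \(1/26 + 1/20 = 23/260\).
    
    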

    Applying the laws of probability is straightforward. An example that illustrates the application of these laws in a transparent way is provided by villages First, Second, Third, and Fourth, which are separated by rivers. (See Figure 1.) Bridges \(1\), \(2\), and \(3\) span the river between First and Second. Bridges \(a\) and \(b\) span the river between Second and Third. Bridges \(A\), \(B\), \(C\), and \(D\) span the river between Third and Fourth. A traveler from First to Fourth who is free to take any route he pleases has a choice from among \(3\times 2\times 4=24\) possible combinations. Let us consider the probabilities associated with various events:

    • There are 24 possible routes. If a traveler chooses his route at random, the probability that he will take any particular route is \({1}/{24}\). This illustrates our assumption that each event in a set of \(N\) exhaustive and mutually exclusive events occurs with probability \({1}/{N}\).
    • If he chooses a route at random, the probability that he goes from First to Second by either bridge \(1\) or bridge \(2\) is \(P\left(1\right)+P\left(2\right)=\ {1}/{3}+{1}/{3}={2}/{3}\). This illustrates the calculation of the probability of alternative events.
    • The probability of the particular route \(2\to a\to C\) is \(P\left(2\right)\times P\left(a\right)\times P\left(C\right)=\left({1}/{3}\right)\left({1}/{2}\right)\left({1}/{4}\right)={1}/{24}\), and we calculate the same probability for any other route from First to Fourth. This illustrates the calculation of the probability of a compound event.
    • If he crosses bridge \(1\), the probability that his route will be \(2\to a\to C\) is zero, of course. The probability of an event that has already occurred is 1, and the probability of any alternative is zero. If he crosses bridge \(1,\) \(P\left(1\right)=1\), and \(P\left(2\right)=P\left(3\right)=0\).
    • Given that a traveler has used bridge \(1\), the probability of the route \(1\to a\to C\) becomes the probability of path \(a\to C\), which is \(P\left(a\right)\times P\left(C\right)=\left({1}/{2}\right)\left({1}/{4}\right)={1}/{8}\). Since \(P\left(1\right)=1\), the probability of the compound event \(1\to a\to C\) is the probability of the compound event \(a\to C\).
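    The route probabilities above can be checked by enumerating all 24 bridge combinations; this is a small sketch using exhaustive enumeration (the bridge labels follow the text):

    ```python
    from fractions import Fraction
    from itertools import product

    # The three river crossings, with the bridges named in the text.
    first_second = ["1", "2", "3"]
    second_third = ["a", "b"]
    third_fourth = ["A", "B", "C", "D"]

    # All 3 x 2 x 4 = 24 possible routes from First to Fourth.
    routes = list(product(first_second, second_third, third_fourth))
    assert len(routes) == 24

    # A route chosen at random has probability 1/24.
    p_route = Fraction(1, len(routes))

    # Probability of crossing the first river by bridge 1 or bridge 2:
    p_1_or_2 = sum(p_route for r in routes if r[0] in ("1", "2"))
    print(p_1_or_2)  # 2/3
    ```

    Counting routes through bridge 1 or 2 gives \(16/24 = 2/3\), agreeing with the sum \(P(1) + P(2) = 1/3 + 1/3\).
    
    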

    The outcomes of rolling dice provide more illustrations. If we roll two dice, we can classify the possible outcomes according to the sums of the outcomes for the individual dice. There are thirty-six possible outcomes. They are displayed in Table 1.

    Table 1: Scores (sums) from tossing two dice

                                   Outcome for first die
         Outcome for second die |   1    2    3    4    5    6
                             1  |   2    3    4    5    6    7
                             2  |   3    4    5    6    7    8
                             3  |   4    5    6    7    8    9
                             4  |   5    6    7    8    9   10
                             5  |   6    7    8    9   10   11
                             6  |   7    8    9   10   11   12

    Let us consider the probabilities associated with various dice-throwing events:

    • The probability of any given outcome, say the first die shows \(2\) and the second die shows \(3\), is \({1}/{36}\).
    • Since the probability that the first die shows \(3\) while the second die shows \(2\) is also \({1}/{36}\), the probability that one die shows \(2\) and the other shows \(3\) is \[P\left(3\right)\times P\left(2\right)+P\left(2\right)\times P\left(3\right) =\left({1}/{36}\right)+\left({1}/{36}\right) ={1}/{18}. \nonumber \]
    • Four different outcomes correspond to the event that the score is \(5\). Therefore, the probability of rolling \(5\) is \[P\left(1\right)\times P\left(4\right)+P\left(2\right)\times P\left(3\right) +P\left(3\right)\times P\left(2\right)+P\left(4\right)\times P\left(1\right) ={1}/{9} \nonumber \]
    • The probability of rolling a score of three or less is the probability of rolling \(2\), plus the probability of rolling \(3\), which is \(\left({1}/{36}\right)+\left({2}/{36}\right)={3}/{36}={1}/{12}\).
    • Suppose we roll the dice one at a time and that the first die shows \(2\). The probability of rolling \(7\) when the second die is thrown is now \({1}/{6}\), because only rolling a \(5\) can make the score 7, and there is a probability of \({1}/{6}\) that a \(5\) will come up when the second die is thrown.
    • Suppose the first die is red and the second die is green. The probability that the red die comes up \(2\) and the green die comes up \(3\) is \(\left({1}/{6}\right)\left({1}/{6}\right)={1}/{36}\).
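    The dice probabilities in the list above can all be verified by enumerating the thirty-six equally likely outcomes of Table 1; the following is a minimal sketch of that enumeration:

    ```python
    from fractions import Fraction
    from itertools import product

    # All 36 equally likely outcomes for two dice, as in Table 1.
    outcomes = list(product(range(1, 7), repeat=2))
    p = Fraction(1, 36)   # probability of any single outcome

    def p_score(s):
        """Probability that the two dice sum to s."""
        return sum(p for a, b in outcomes if a + b == s)

    # Four outcomes sum to 5, so P(score = 5) = 4/36 = 1/9.
    print(p_score(5))

    # One die shows 2 and the other shows 3: two outcomes, 2/36 = 1/18.
    p_2_and_3 = sum(p for a, b in outcomes if {a, b} == {2, 3})
    print(p_2_and_3)

    # Score of three or less: P(2) + P(3) = 1/36 + 2/36 = 1/12.
    print(p_score(2) + p_score(3))
    ```

    Enumerating outcomes this way makes the "count the favorable cases" logic of the text explicit.
    
    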

    Above we looked at the number of outcomes associated with a score of \(3\) to find that the probability of this event is \({1}/{18}\). We can use another argument to get this result. The probability that two dice roll a score of three is equal to the probability that the first die shows \(1\) or \(2\), multiplied by the probability that the second die shows whatever score is necessary to make the total equal to three. This is:

    \[\begin{align*} P\left(first\ die\ shows\ 1\ or\ 2\right)\times \left({1}/{6}\right) &= \left[\left({1}/{6}\right)+\left({1}/{6}\right)\right]\times {1}/{6} \\[4pt] &={2}/{36} \\[4pt]& ={1}/{18} \end{align*} \]

    Application of the laws of probability is frequently made easier by recognizing a simple restatement of the requirement that events be mutually exclusive. In a given trial, either an event occurs or it does not. Let the probability that an event A occurs be \(P\left(A\right)\). Let the probability that event A does not occur be \(P\left(\sim A\right)\). Since in any given trial, the outcome must belong either to event A or to event \(\sim A\), we have

    \[P\left(A\right)+P\left(\sim A\right)=1 \nonumber \]

    For example, if the probability of success in a single trial is \({2}/{3}\), the probability of failure is \({1}/{3}\). If we consider the outcomes of two successive trials, we can group them into four events.

    • Event SS: First trial is a success; second trial is a success.
    • Event SF: First trial is a success; second trial is a failure.
    • Event FS: First trial is a failure; second trial is a success.
    • Event FF: First trial is a failure; second trial is a failure.

    Using the laws of probability, we have

    \[ \begin{align*} 1 &=P\left(Event\ SS\right)+P\left(Event\ SF\right)+P\left(Event\ FS\right)+\ P(Event\ FF) \\[4pt] &=P_1\left(S\right)\times P_2\left(S\right)+P_1\left(S\right)\times P_2\left(F\right) +P_1(F)\times P_2(S)+P_1(F)\times P_2(F) \end{align*} \]

    where \(P_1\left(X\right)\) and \(P_2\left(X\right)\) are the probability of event \(X\) in the first and second trials, respectively.
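    With the example probabilities from the text, \(P(S) = 2/3\) in each trial, this decomposition can be checked directly; the following is a short sketch:

    ```python
    from fractions import Fraction

    # Two independent trials with P(success) = 2/3 in each (as in the text).
    p_S = Fraction(2, 3)
    p_F = 1 - p_S          # complement rule: P(F) = 1 - P(S) = 1/3

    # The four mutually exclusive compound events:
    p_SS = p_S * p_S       # 4/9
    p_SF = p_S * p_F       # 2/9
    p_FS = p_F * p_S       # 2/9
    p_FF = p_F * p_F       # 1/9

    # The four events are exhaustive, so their probabilities sum to 1.
    print(p_SS + p_SF + p_FS + p_FF)
    ```

    The four products are exactly the four terms of the equation above, and they sum to \(4/9 + 2/9 + 2/9 + 1/9 = 1\).
    
    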

    This situation can be mapped onto a simple diagram. We represent the possible outcomes of the first trial by line segments of lengths \(P_1\left(S\right)\) and \(P_1\left(F\right)\) along one side of a unit square; these segments fill the side because \(P_1\left(S\right)+P_1\left(F\right)=1\). We represent the outcomes of the second trial by line segments along an adjoining side of the unit square. The four possible events are now represented by the areas of four mutually exclusive and exhaustive portions of the unit square, as shown in Figure 2.

    Figure 2. Success and failure in successive trials.

    This page titled 3.4: Applying the Laws of Probability is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Paul Ellgen via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.