6.4: Conditional Probability and Independent Events
Focus on the modeling advice and steps 1-4 on pages 1-9.
Read this lecture. Try to understand the concept of conditional probability, and just scan the examples. We'll take a closer look at the problems in the next section.
6.4.1: Computing a Conditional Probability
Work through the following examples in this lecture:
Section 1, "The Halting Problem" (fictitious name of a hockey team) on page 2;
Section 2.1, "A Coin Problem" on page 6;
Section 2.2, "A Variant of the Two Coins Problem" on page 8;
Section 3, "Medical Testing" on page 9;
Section 4.1, "Carnival Dice" on page 11;
Section 4.3, "Discrimination Lawsuit" on page 14; and
Section 4.4, "On-Time Airlines" on page 15.
Understanding the examples is critical to really understanding conditional probability.
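If you want to check your understanding of the basic definition before working the examples, the following minimal Python sketch applies P(A | B) = P(A and B) / P(B) to a small sample space. The helper names (prob, cond_prob) and the two-dice events are just for this sketch and are not taken from the lecture.

    from fractions import Fraction

    # Sample space: ordered pairs for two fair dice, all 36 outcomes equally likely.
    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def prob(event):
        # Probability of an event (a predicate on outcomes) under the uniform distribution.
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    def cond_prob(a, b):
        # P(A | B) = P(A and B) / P(B); only defined when P(B) > 0.
        return prob(lambda o: a(o) and b(o)) / prob(b)

    # Illustrative events (not from the lecture): A = "the sum is 8",
    # B = "the first die is even".
    sum_is_8 = lambda o: o[0] + o[1] == 8
    first_even = lambda o: o[0] % 2 == 0
    print(cond_prob(sum_is_8, first_even))  # 1/6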
6.4.2: Bayes's Theorem
Read Section 3 on pages DT-28 through DT-35.
Bayes's theorem is a famous theorem for computing conditional probabilities. Suppose E1, ..., En are mutually exclusive events whose union contains the event D (for example, the Ei might partition the sample space), and suppose P(D) > 0. Then

    P(Ei | D) = P(D | Ei) P(Ei) / [P(D | E1) P(E1) + ... + P(D | En) P(En)].

Statements of Bayes's theorem are given on pages DT-28 and DT-32. Like the binomial theorem, Bayes's theorem is very useful in calculating probabilities in many applications, for example, in diagnosis and in decision theory.
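As a quick numerical illustration in the medical-testing style (the rates below and the variable names are made up for this guide, not taken from the reading), take E1 = "has the condition," E2 = "does not," and D = "tests positive":

    # Illustrative rates, not taken from the reading:
    # the condition affects 1% of the population, the test detects it 99% of
    # the time, and it gives a false positive 5% of the time.
    p_e = [0.01, 0.99]          # P(E1), P(E2): has the condition / does not
    p_d_given_e = [0.99, 0.05]  # P(D | E1), P(D | E2): positive test rates

    # Denominator: P(D) by the law of total probability.
    p_d = sum(pd * pe for pd, pe in zip(p_d_given_e, p_e))

    # Bayes's theorem: P(E1 | D) = P(D | E1) P(E1) / P(D).
    p_e1_given_d = p_d_given_e[0] * p_e[0] / p_d
    print(round(p_e1_given_d, 3))  # about 0.167

Even with a 99% detection rate, a positive test here corresponds to only about a 17% chance of having the condition, because the condition itself is rare; making this kind of reasoning routine is exactly what the theorem is for.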
6.4.3: Computing the Probability of Independent Events
Read this lecture for practice with the probability of independent events.
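A minimal sketch of the defining check for independence, P(A and B) = P(A) P(B), on an illustrative two-dice example (the events are not drawn from the lecture):

    from fractions import Fraction

    outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def prob(event):
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    # Illustrative events (not from the lecture):
    # A = "the first die shows a 3", B = "the second die is even".
    a = lambda o: o[0] == 3
    b = lambda o: o[1] % 2 == 0

    # Independence means P(A and B) = P(A) P(B).
    print(prob(lambda o: a(o) and b(o)) == prob(a) * prob(b))  # True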