Week 1
3. Sample space: A sample space is a set that contains all possible outcomes of an experiment.
• The sample space of an experiment is typically denoted S.
• example: Toss a coin: S = { heads, tails }
5. Disjoint events: Two events with an empty intersection are said to be disjoint events.
Properties of a probability function P:
• P (∅) = 0
• P (E^c) = 1 − P (E)
• If E ⊆ F, then P (F ) = P (E) + P (F \ E), and hence P (E) ≤ P (F ).
• For any events E and F, P (E) = P (E ∩ F ) + P (E \ F ) and P (F ) = P (E ∩ F ) + P (F \ E).
• P (E ∪ F ) = P (E) + P (F ) − P (E ∩ F )
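A minimal Python sketch (illustrative only; the fair-die sample space and the events E, F below are assumptions for the example) that checks these properties numerically:

```python
from fractions import Fraction

# Assumed example: a fair six-sided die, S = {1, ..., 6}.
S = set(range(1, 7))
def P(event):
    """Probability of an event (a subset of S) under equally likely outcomes."""
    return Fraction(len(event & S), len(S))

E = {1, 2}          # E = "roll 1 or 2"
F = {1, 2, 3}       # F = "roll at most 3"; note E is a subset of F

assert P(set()) == 0                                    # P(empty set) = 0
assert P(S - E) == 1 - P(E)                             # complement rule
assert P(F) == P(E) + P(F - E) and P(E) <= P(F)         # monotonicity for E ⊆ F
assert P(E) == P(E & F) + P(E - F)                      # split E over F
assert P(E | F) == P(E) + P(F) - P(E & F)               # inclusion-exclusion
```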
13. Equally likely events: each outcome of the sample space is assigned the same probability.
15. Conditional probability space: Consider a probability space (S, E, P ), where S
represents the sample space, E represents the collection of events, and P represents
the probability function.
• Let B be an event in S with P (B) > 0. The conditional probability space given B is defined as follows: for any event A in the original probability space (S, E, P ), the conditional probability of A given B is P (A ∩ B)/P (B).
• It is denoted by P (A | B). And
P (A ∩ B) = P (B)P (A | B)
• If the events B and B^c partition the sample space S such that P (B), P (B^c) ≠ 0,
then for any event A of S,
P (A) = P (B)P (A | B) + P (B^c)P (A | B^c)
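A short Python sketch of the definition, on an assumed fair-die example with A = "roll is even" and B = "roll is at most 4":

```python
from fractions import Fraction

S = set(range(1, 7))                        # fair die, equally likely outcomes
P = lambda ev: Fraction(len(ev & S), len(S))

A = {2, 4, 6}                               # "roll is even"
B = {1, 2, 3, 4}                            # "roll is at most 4"

P_A_given_B = P(A & B) / P(B)               # conditional probability of A given B
assert P_A_given_B == Fraction(1, 2)
assert P(A & B) == P(B) * P_A_given_B       # multiplication rule

# Total probability over the partition {B, B^c}:
Bc = S - B
assert P(A) == P(B) * (P(A & B) / P(B)) + P(Bc) * (P(A & Bc) / P(Bc))
```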
17. Bayes’ theorem: Let A and B be two events such that P (A) > 0, P (B) > 0. Then
P (A ∩ B) = P (B)P (A | B) = P (A)P (B | A)
⇒ P (B | A) = P (B)P (A | B) / P (A)
In general, if the events B1, B2, · · · , Bk partition S such that P (Bi) ≠ 0 for i = 1, 2, · · · , k, then for any event A in S such that P (A) ≠ 0,
P (Br | A) = P (Br)P (A | Br) / [ P (B1)P (A | B1) + · · · + P (Bk)P (A | Bk) ]
for r = 1, 2, · · · , k.
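A small Python sketch of Bayes’ theorem over a partition; the three events B1, B2, B3 and their priors and likelihoods below are made-up numbers for illustration:

```python
from fractions import Fraction

# Assumed partition B1, B2, B3 with priors P(Bi) and likelihoods P(A | Bi).
prior = {"B1": Fraction(1, 2), "B2": Fraction(1, 3), "B3": Fraction(1, 6)}
likelihood = {"B1": Fraction(1, 4), "B2": Fraction(1, 2), "B3": Fraction(3, 4)}

# Denominator: total probability P(A) = sum_i P(Bi) P(A | Bi)
P_A = sum(prior[b] * likelihood[b] for b in prior)

# Bayes' theorem: P(Br | A) = P(Br) P(A | Br) / P(A)
posterior = {b: prior[b] * likelihood[b] / P_A for b in prior}

assert sum(posterior.values()) == 1       # posteriors over a partition sum to 1
print(posterior)
```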
18. Independence of two events: Two events A and B are independent iff
P (A ∩ B) = P (A)P (B)
• Disjoint events with positive probabilities are never independent.
• A and B independent ⇒ A and B^c are independent.
• A and B independent ⇒ A^c and B^c are independent.
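A quick Python check of these facts on an assumed example of two fair dice:

```python
from fractions import Fraction
from itertools import product

# Assumed example: two fair dice, all 36 ordered pairs equally likely.
S = set(product(range(1, 7), repeat=2))
P = lambda ev: Fraction(len(ev & S), len(S))

A = {s for s in S if s[0] % 2 == 0}        # "first die is even"
B = {s for s in S if s[1] == 6}            # "second die shows 6"

assert P(A & B) == P(A) * P(B)                        # A and B independent
assert P(A & (S - B)) == P(A) * P(S - B)              # A and B^c independent
assert P((S - A) & (S - B)) == P(S - A) * P(S - B)    # A^c and B^c independent
```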
19. Mutual independence of three events: Events A, B, and C are mutually indepen-
dent if
(a) P (A ∩ B) = P (A)P (B)
(b) P (A ∩ C) = P (A)P (C)
(c) P (B ∩ C) = P (B)P (C)
(d) P (A ∩ B ∩ C) = P (A)P (B)P (C)
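Condition (d) does not follow from (a)–(c). A small Python sketch with an assumed example (two fair coin tosses) where the pairwise conditions hold but (d) fails:

```python
from fractions import Fraction
from itertools import product

# Assumed example: two fair coin tosses, equally likely outcomes.
S = set(product("HT", repeat=2))
P = lambda ev: Fraction(len(ev & S), len(S))

A = {s for s in S if s[0] == "H"}          # first toss is heads
B = {s for s in S if s[1] == "H"}          # second toss is heads
C = {s for s in S if s[0] == s[1]}         # both tosses agree

# Conditions (a)-(c) hold: the events are pairwise independent ...
assert P(A & B) == P(A) * P(B)
assert P(A & C) == P(A) * P(C)
assert P(B & C) == P(B) * P(C)
# ... but condition (d) fails, so they are not mutually independent.
assert P(A & B & C) != P(A) * P(B) * P(C)
```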
20. Mutual independence of multiple events: Events A1, A2, · · · , An are mutually independent if, for every choice of distinct indices i1, i2, · · · , ik,
P (Ai1 ∩ Ai2 ∩ · · · ∩ Aik) = P (Ai1)P (Ai2) · · · P (Aik)
If n events are mutually independent, then any subset of them, with or without complementing, is also independent.
21. Occurrence of an event A in a trial is considered a success.
22. Non-occurrence of the event A in a trial is considered a failure.
23. Repeated independent trials:
(a) Bernoulli trials
• Single Bernoulli trial:
– Sample space is {success, failure} with P(success) = p.
– We can also write the sample space S as {0, 1}, where 0 denotes failure and
1 denotes success, with P (1) = p, P (0) = 1 − p.
This kind of distribution is denoted by Bernoulli(p).
• Repeated Bernoulli trials:
– Repeat a Bernoulli trial multiple times independently.
– For each trial, the outcome is either 0 or 1.
(b) Binomial distribution: Perform n independent Bernoulli(p) trials.
• It models the number of successes in n independent Bernoulli trials.
• Denoted by B(n, p).
• Sample space is {0, 1, · · · , n}.
• Probability distribution is given by
P (B(n, p) = k) = nCk p^k (1 − p)^(n−k)
where n represents the total number of trials and k represents the number of
successes in those n trials.
• P (B = 0) + P (B = 1) + · · · + P (B = n) = 1
⇒ (1 − p)^n + nC1 p(1 − p)^(n−1) + nC2 p^2 (1 − p)^(n−2) + · · · + p^n = 1.
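A minimal Python sketch of the B(n, p) probabilities, with illustrative values of n and p, verifying that they sum to 1:

```python
from math import comb

def binomial_pmf(n, p, k):
    """P(B(n, p) = k) = nCk * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3                             # illustrative choices
pmf = [binomial_pmf(n, p, k) for k in range(n + 1)]

# The probabilities over the sample space {0, 1, ..., n} sum to 1.
assert abs(sum(pmf) - 1.0) < 1e-12
print(pmf[3])                              # e.g. P(B = 3)
```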
(c) Geometric distribution: It models the number of trials needed to get the first success.
• Outcomes: Number of trials needed for first success and is denoted by G(p).
• Sample space: {1, 2, 3, 4, · · · }
• P (G = k) = P (first k − 1 trials result in 0 and the kth trial results in 1) =
(1 − p)^(k−1) p.
• Identity: P (G ≤ k) = 1 − (1 − p)^k.
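A short Python sketch with assumed values of p and k, checking the identity P (G ≤ k) = 1 − (1 − p)^k against the pmf:

```python
def geometric_pmf(p, k):
    """P(G(p) = k) = (1 - p)^(k - 1) * p, for k = 1, 2, 3, ..."""
    return (1 - p) ** (k - 1) * p

p, k = 0.25, 6                             # illustrative choices
# Identity: P(G <= k) = 1 - (1 - p)^k
lhs = sum(geometric_pmf(p, j) for j in range(1, k + 1))
rhs = 1 - (1 - p) ** k
assert abs(lhs - rhs) < 1e-12
```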