Probability Class 12 Maths Notes: Chapter 13 (CBSE)

Chapter 13


Conditional Probability

Let A and B be two events associated with a random experiment. The probability of the occurrence of A under the condition that B has already occurred, where P(B) ≠ 0, is called the conditional probability of A given B and is written as P(A/B). The formula to evaluate P(A/B) is

$$P(A/B)=\frac{P(A\cap B)}{P(B)},\quad P(B)\neq 0$$

Similarly,

$$P(B/A)=\frac{P(A\cap B)}{P(A)},\quad P(A)\neq 0$$
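A short worked example (added here for illustration; it is not part of the original notes): a die is thrown twice. Let B be the event "the first throw shows 3" and A the event "the sum of the two throws is 8". Then P(B) = 6/36 and P(A ∩ B) = 1/36, since only the outcome (3, 5) lies in both events, so

$$P(A/B)=\frac{P(A\cap B)}{P(B)}=\frac{1/36}{6/36}=\frac{1}{6}$$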

Multiplication Theorem On Probability

Let A and B be two events associated with a sample space S. Then

$$P(A\cap B)=P(A)\,P(B/A)=P(B)\,P(A/B),$$

provided P(A) ≠ 0 and P(B) ≠ 0.

If A, B and C are three events associated with a random experiment, then

$$P(A\cap B\cap C)=P(A)\,P(B/A)\,P(C/A\cap B)$$

Similarly, if A1, A2, ..., An are n events related to a random experiment, then

$$P(A_1\cap A_2\cap A_3\cap\ldots\cap A_n)=P(A_1)\,P(A_2/A_1)\,P(A_3/A_1\cap A_2)\ldots P(A_n/A_1\cap A_2\cap A_3\cap\ldots\cap A_{n-1})$$
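As a quick illustration (an added example, not from the original notes): two cards are drawn one after another, without replacement, from a well-shuffled pack of 52 cards. If A1 and A2 denote "the first card is an ace" and "the second card is an ace", then

$$P(A_1\cap A_2)=P(A_1)\,P(A_2/A_1)=\frac{4}{52}\times\frac{3}{51}=\frac{1}{221}$$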

Independent Events

If A and B are independent events, then P(A/B) is precisely the same as P(A), since the occurrence of A is not affected by B, i.e., P(A/B) = P(A). Similarly, P(B/A) = P(B). Now, by the multiplication rule of probability, we have P(A ∩ B) = P(A) P(B/A) = P(B) P(A/B), which gives

$$P(A\cap B)=P(A)\,P(B)$$

(known as the multiplication rule for independent events).

Note

So, any one of the following three conditions may be used as a test for the independence of events:
(a) P(A ∩ B) = P(A) P(B)
(b) P(A/B) = P(A)
(c) P(B/A) = P(B)
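For instance (an added check, not part of the original notes): in a single throw of a die, let A be "an even number turns up" and B be "a number less than or equal to 4 turns up". Then P(A) = 3/6, P(B) = 4/6 and A ∩ B = {2, 4}, so

$$P(A\cap B)=\frac{2}{6}=\frac{1}{3}=\frac{1}{2}\times\frac{2}{3}=P(A)\,P(B)$$

Hence A and B are independent by test (a).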

Total Probability

Theorem:

Let S be the sample space and E1, E2, ..., En be n mutually exclusive and exhaustive events associated with a random experiment. Let A be any event associated with S, i.e., an event which occurs with E1 or E2 or ... or En. Then P(A) = P(E1) P(A/E1) + P(E2) P(A/E2) + ... + P(En) P(A/En), i.e.,

$$P(A)=\sum_{i=1}^{n} P(E_i)\,P(A/E_i)$$
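A brief worked example (added for illustration; the numbers are ours, not from the original notes): Bag I contains 3 red and 4 black balls, and Bag II contains 5 red and 6 black balls. One bag is chosen at random (events E1 and E2, each with probability 1/2) and a ball is drawn; let A be the event that the ball drawn is red. Then

$$P(A)=P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)=\frac{1}{2}\cdot\frac{3}{7}+\frac{1}{2}\cdot\frac{5}{11}=\frac{3}{14}+\frac{5}{22}=\frac{34}{77}$$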

Note

The rule of total probability can also be depicted by a tree diagram, with the events E1, E2, ..., En as the first-level branches and the event A branching from each of them.

Bayes’ Theorem

Let E1, E2, ..., En be n mutually exclusive and exhaustive events associated with a random experiment, and let A be any event of non-zero probability which occurs with some Ei. Then

$$P(E_i/A)=\frac{P(A\cap E_i)}{P(A\cap E_1)+P(A\cap E_2)+\ldots+P(A\cap E_n)}$$

We know that P(A ∩ Ei) = P(Ei) P(A/Ei), so

$$P(E_i/A)=\frac{P(E_i)\,P(A/E_i)}{P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)+\ldots+P(E_n)\,P(A/E_n)}=\frac{P(E_i)\,P(A/E_i)}{\sum P(E_i)\,P(A/E_i)}$$
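Continuing the bag example sketched under the total probability rule (again an added illustration, not from the original notes): given that the ball drawn is red, the probability that it came from Bag I is

$$P(E_1/A)=\frac{P(E_1)\,P(A/E_1)}{P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)}=\frac{\frac{1}{2}\cdot\frac{3}{7}}{\frac{34}{77}}=\frac{33}{68}$$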

Random Variable And Its Probability Distribution

Random variable:

A random variable is a real-valued function X defined over the entire sample space of an experiment, i.e., it is a function which associates a unique real number with each point of the sample space of the experiment.
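For example (added for illustration, not from the original notes): if a coin is tossed twice and X denotes the number of heads obtained, then X is a random variable with

$$X(\text{HH})=2,\quad X(\text{HT})=X(\text{TH})=1,\quad X(\text{TT})=0$$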

Probability distribution table:

Let a discrete random variable X assume the values x1, x2, x3, ..., xn with probabilities p1, p2, p3, ..., pn respectively, satisfying p1 + p2 + p3 + ... + pn = 1 and 0 ≤ pi ≤ 1 for i = 1, 2, 3, ..., n. In this case, the following table describes the probability distribution of X.

X      x1    x2    x3    ........    xn
P(X)   p1    p2    p3    ........    pn

The different values x1, x2, ..., xn of X, together with their corresponding probabilities, form the probability distribution of the random variable X.
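As a small added example (not from the original notes), for the number of heads X in two tosses of a fair coin the probability distribution is

$$\begin{array}{c|ccc} X & 0 & 1 & 2 \\ \hline P(X) & \frac{1}{4} & \frac{1}{2} & \frac{1}{4} \end{array}$$

and the probabilities add up to 1, as required.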

Mean of a Random Variable

Let X be a random variable whose possible values x1, x2, x3, ..., xn occur with probabilities p1, p2, p3, ..., pn respectively. The mean of X, denoted by μ, is the number

$$\mu=\sum_{i=1}^{n} x_i\,p_i$$

That is, the mean of X is the weighted average of the possible values of X, each value being weighted by the probability with which it occurs. The mean of a random variable X is also called the expectation of X, denoted by E(X). Thus, E(X) = μ = Σ xi pi = x1p1 + x2p2 + ... + xnpn. In other words, the mean or expectation of a random variable X is the sum of the products of all possible values of X and their respective probabilities.
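A short worked example (added here; not part of the original notes): if X is the number obtained in a single throw of a fair die, each of the values 1, 2, ..., 6 occurs with probability 1/6, so

$$E(X)=\sum_{i=1}^{6} x_i\,p_i=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}=\frac{21}{6}=3.5$$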