Conditional Probability
Let A and B be two events associated with a random experiment. The probability of the occurrence of A under the condition that B has already occurred, where P(B) ≠ 0, is called the conditional probability of A given B and is written as P(A/B).
Formula to evaluate P(A/B):
$$P(A/B)=\frac{P(A\cap B)}{P(B)},\quad P(B)\neq 0$$
Similarly,
$$P(B/A)=\frac{P(A\cap B)}{P(A)},\quad P(A)\neq 0$$
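For example (an illustrative case with assumed events): suppose a fair die is thrown once, B is the event "an even number appears" and A is the event "the number 4 appears". Then P(B) = 3/6 = 1/2 and P(A ∩ B) = 1/6, so
$$P(A/B)=\frac{P(A\cap B)}{P(B)}=\frac{1/6}{1/2}=\frac{1}{3}$$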
Multiplication Theorem On Probability
If A and B are two events associated with a random experiment, then
$$P(A\cap B)=P(A)\,P(B/A),\quad P(A)\neq 0$$
$$P(A\cap B)=P(B)\,P(A/B),\quad P(B)\neq 0$$
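For instance (an illustrative case with assumed numbers): two cards are drawn one after the other, without replacement, from a standard deck of 52 cards. If A is "the first card is an ace" and B is "the second card is an ace", then
$$P(A\cap B)=P(A)\,P(B/A)=\frac{4}{52}\times\frac{3}{51}=\frac{1}{221}$$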
Independent Events
If A and B are independent events, then P(A/B) is precisely the same as P(A), since the occurrence of B does not affect A, i.e., P(A/B) = P(A). Similarly, P(B/A) = P(B). Now, by the multiplication rule of probability, we have
$$P(A\cap B)=P(A)\,P(B/A)=P(B)\,P(A/B)$$
$$\Rightarrow\quad P(A\cap B)=P(A)\,P(B)$$
(known as the multiplication rule for independent events).
Note
So, any one of these three conditions may be used as a test for independence of events:
(a) P(A ∩ B) = P(A) P(B)
(b) P(A/B) = P(A)
(c) P(B/A) = P(B)
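As a quick check of these conditions (with an assumed experiment): toss a fair coin twice, and let A be "head on the first toss" and B be "head on the second toss". Then P(A) = P(B) = 1/2 and P(A ∩ B) = 1/4 = P(A) P(B), so A and B are independent.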
Total Probability
Theorem:
Let S be the sample space and E1, E2, ..., En be n mutually exclusive and exhaustive events associated with a random experiment. Let A be any event associated with S, i.e., an event which can occur only with E1 or E2 or ... or En. Then
$$P(A)=P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)+\dots+P(E_n)\,P(A/E_n)=\sum^{n}_{i=1}P(E_i)\,P(A/E_i)$$
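For example (an illustrative setup with assumed contents): Bag I contains 3 red and 2 black balls, and Bag II contains 2 red and 3 black balls. One bag is chosen at random and a ball is drawn from it. With E1 = "Bag I is chosen", E2 = "Bag II is chosen" and A = "the ball drawn is red",
$$P(A)=P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)=\frac{1}{2}\times\frac{3}{5}+\frac{1}{2}\times\frac{2}{5}=\frac{1}{2}$$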
Note
The rule of total probability can be depicted by a tree diagram: the first set of branches carries the probabilities P(Ei) and the second set carries the conditional probabilities P(A/Ei); P(A) is the sum of the products along the branches ending in A.
Bayes’ Theorem
If E1, E2, ..., En are n mutually exclusive and exhaustive events associated with a random experiment and A is any event of non-zero probability, then for each i = 1, 2, ..., n,
$$P(E_i/A)=\frac{P(A\cap E_i)}{P(A\cap E_1)+P(A\cap E_2)+\dots+P(A\cap E_n)}$$
We know that P(A ∩ Ei) = P(Ei) P(A/Ei). Therefore,
$$P(E_i/A)=\frac{P(E_i)\,P(A/E_i)}{P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)+\dots+P(E_n)\,P(A/E_n)}=\frac{P(E_i)\,P(A/E_i)}{\sum^{n}_{k=1}P(E_k)\,P(A/E_k)}$$
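Continuing the illustrative two-bag example above: if the ball drawn turns out to be red, the probability that it came from Bag I is
$$P(E_1/A)=\frac{P(E_1)\,P(A/E_1)}{P(E_1)\,P(A/E_1)+P(E_2)\,P(A/E_2)}=\frac{\frac{1}{2}\times\frac{3}{5}}{\frac{1}{2}\times\frac{3}{5}+\frac{1}{2}\times\frac{2}{5}}=\frac{3}{5}$$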
Random Variable And Its Probability Distribution
Random variable:
A random variable is a real-valued function X defined over the entire sample space of an experiment, i.e., it is a function which associates a unique real number with each point of the sample space.
Probability distribution table:
X | x1 | x2 | x3 | ... | xn
P(X) | p1 | p2 | p3 | ... | pn
where pi > 0 for each i = 1, 2, ..., n and p1 + p2 + ... + pn = 1.
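For example (an assumed experiment): if X is the number of heads obtained in two tosses of a fair coin, its probability distribution is
X | 0 | 1 | 2
P(X) | 1/4 | 1/2 | 1/4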
Mean Of A Random Variable
Let X be a random variable whose possible values x1, x2, x3, ..., xn occur with probabilities p1, p2, p3, ..., pn respectively. The mean of X, denoted by μ, is the number
$$\mu=\sum^{n}_{i=1}x_i p_i,$$
i.e., the mean of X is the weighted average of the possible values of X, each value being weighted by the probability with which it occurs. The mean of a random variable X is also called the expectation of X, denoted by E(X). Thus,
$$E(X)=\mu=\sum^{n}_{i=1}x_i p_i=x_1p_1+x_2p_2+\dots+x_np_n$$
In other words, the mean or expectation of a random variable X is the sum of the products of all possible values of X by their respective probabilities.
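Using the assumed coin-toss distribution above: for X = number of heads in two tosses of a fair coin,
$$E(X)=0\times\frac{1}{4}+1\times\frac{1}{2}+2\times\frac{1}{4}=1$$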