Bayes' Law for Continuous Random Variables
Bayes Theorem
Bayes theorem is a theorem in probability and statistics, named after the Reverend Thomas Bayes, that helps in determining the probability of an event based on some event that has already occurred. Bayes theorem has many applications, such as Bayesian inference and, in the healthcare sector, determining the chances of developing health problems with an increase in age. Here, we will aim at understanding the use of the Bayes theorem in determining the probability of events, its statement, formula, and derivation with the help of examples.
1. | What is Bayes Theorem? |
2. | Proof of Bayes Theorem |
3. | Bayes Theorem Formula |
4. | Difference between Conditional Probability and Bayes Theorem |
5. | Terms Related to Bayes Theorem |
6. | FAQs on Bayes Theorem |
What is Bayes Theorem?
Bayes theorem, in simple words, determines the conditional probability of an event A given that event B has already occurred. Bayes theorem is also known as the Bayes rule or Bayes law. It is a method for determining the probability of an event based on the occurrence of prior related events. Now, let us state the theorem and its proof. Bayes theorem states that the conditional probability of an event A, given the occurrence of another event B, is equal to the product of the likelihood of B given A and the probability of A, divided by the probability of B. It is given as:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Here, P(A) = how likely A happens (prior) - the probability that the hypothesis is true before any evidence is present.
P(B) = how likely B happens (marginalization) - the probability of observing the evidence.
P(A|B) = how likely A happens given that B has happened (posterior) - the probability that the hypothesis is true given the evidence.
P(B|A) = how likely B happens given that A has happened (likelihood) - the probability of seeing the evidence if the hypothesis is true.
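As a quick numerical illustration of these four quantities, consider a hypothetical medical test; all numbers below are invented for illustration, not taken from the text above:

```python
# Hypothetical medical-test example (all numbers invented for illustration):
# A = "patient has the disease", B = "test result is positive".
p_A = 0.01              # P(A): prior probability of the disease
p_B_given_A = 0.95      # P(B|A): likelihood (test sensitivity)
p_B_given_not_A = 0.05  # P(B|not A): false-positive rate

# P(B): marginal probability of a positive test, via total probability
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes theorem: P(A|B) = P(B|A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # a positive test raises the 1% prior to about 16%
```

Note how the posterior stays far below the test's 95% sensitivity: the small prior P(A) dominates the result.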
Bayes Theorem - Statement
The statement of Bayes theorem is as follows: Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) have non-zero probability of occurrence and form a partition of S. Let A be any event that occurs along with \(E_{1}\) or \(E_{2}\) or \(E_{3}\) ... or \(E_{n}\). Then, according to Bayes theorem,
\(P(E_{i} | A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\)
- Here E\(_i\) ∩ E\(_j\) = ∅ for i ≠ j, i.e., the events are mutually exclusive.
- The union of all the events of the partition gives the whole sample space S.
- 0 ≤ P(E\(_{i}\)) ≤ 1
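The partition form of the theorem can be sketched numerically. The numbers below are invented for a hypothetical three-machine factory example:

```python
# Sketch of the partition form of Bayes theorem (invented numbers):
# E1, E2, E3 = "item was made by machine 1, 2, 3"; A = "item is defective".
p_E = [0.5, 0.3, 0.2]             # P(E_i): machine shares, a partition of S
p_A_given_E = [0.01, 0.02, 0.03]  # P(A|E_i): each machine's defect rate

# Denominator: total probability P(A) = sum_k P(E_k) P(A|E_k)
p_A = sum(pe * pa for pe, pa in zip(p_E, p_A_given_E))

# P(E_i|A): probability that a defective item came from machine i
posteriors = [pe * pa / p_A for pe, pa in zip(p_E, p_A_given_E)]
print([round(p, 4) for p in posteriors])  # the three posteriors sum to 1
```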
Proof of Bayes Theorem
To prove the Bayes theorem, we will use the total probability and conditional probability formulas. The total probability formula expresses the probability of an event A in terms of other, related events when A's probability is not known directly. Conditional probability is the probability of event A given that another related event has already occurred.
Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a partition of the sample space S, and let A be an event that occurred. Let us express A in terms of the \(E_{i}\).
A = A ∩ S
= A ∩ (\(E_{1} \cup E_{2} \cup E_{3} \cup ... \cup E_{n}\))
A = (A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))
P(A) = P[(A ∩ \(E_{1}\)) ∪ (A ∩ \(E_{2}\)) ∪ (A ∩ \(E_{3}\)) ∪ ... ∪ (A ∩ \(E_{n}\))]
We know that when A and B are disjoint sets, P(A ∪ B) = P(A) + P(B), and the sets A ∩ \(E_{i}\) are pairwise disjoint here.
Thus, P(A) = P(A ∩ \(E_{1}\)) + P(A ∩ \(E_{2}\)) + P(A ∩ \(E_{3}\)) + ... + P(A ∩ \(E_{n}\))
According to the multiplication theorem for dependent events, we have
P(A) = P(\(E_{1}\))P(A|\(E_{1}\)) + P(\(E_{2}\))P(A|\(E_{2}\)) + P(\(E_{3}\))P(A|\(E_{3}\)) + ... + P(\(E_{n}\))P(A|\(E_{n}\))
Thus, the total probability is P(A) = \(\sum_{i=1}^{n}P(E_{i})P(A|E_{i})\) --- (1)
Recalling the conditional probability, we get
\(P(E_{i}|A) = \dfrac{P(E_{i}\cap A)}{P(A)} , i=1,2,3,...,n\) ---(2)
Using the formula for conditional probability of \(P(A|E_{i})\), we have
\(P(E_{i}\cap A) = P(A|E_{i}) P(E_{i})\) --- (3)
Substituting equations (1) and (3) in equation (2), we get
\(P(E_{i}|A) = \dfrac{P(A|E_{i}) P(E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})}, i=1,2,3,...,n\)
Hence, Bayes Theorem is proved.
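The steps of the proof can be checked numerically for a small made-up partition: the joint probabilities come from equation (3), their sum is the total probability of equation (1), and the direct conditional of equation (2) agrees with the Bayes-theorem form:

```python
# Numerical sanity check of the derivation, with made-up numbers:
# the Bayes-theorem form of P(E_i|A) should equal the direct
# conditional probability P(E_i ∩ A) / P(A).
p_E = [0.2, 0.5, 0.3]          # P(E_i): a partition (probabilities sum to 1)
p_A_given_E = [0.4, 0.1, 0.6]  # P(A|E_i)

# Equation (3): joint probabilities P(E_i ∩ A) = P(A|E_i) P(E_i)
p_joint = [pa * pe for pa, pe in zip(p_A_given_E, p_E)]

# Equation (1): total probability P(A) = sum_k P(E_k) P(A|E_k)
p_A = sum(p_joint)

# Equation (2): P(E_i|A) = P(E_i ∩ A) / P(A)
direct = [pj / p_A for pj in p_joint]

# Bayes theorem: P(E_i|A) = P(A|E_i) P(E_i) / sum_k P(E_k) P(A|E_k)
bayes = [pa * pe / p_A for pa, pe in zip(p_A_given_E, p_E)]

print(all(abs(d - b) < 1e-12 for d, b in zip(direct, bayes)))  # True
```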
Bayes Theorem Formula
The Bayes theorem formula exists for events and for random variables, and in both cases it is derived from the definition of conditional probability. It can be stated for events A and B, as well as for continuous random variables X and Y. Let us first see the formula for events.
Bayes Theorem Formula for Events
The formula for events derived from the definition of conditional probability is:
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}, P(B) \neq 0\)
Derivation:
According to the definition of conditional probability, \(P(A|B) = \dfrac{P(A \cap B)}{P(B)}, P(B) \neq 0\) and we know that \(P(A \cap B) = P(B \cap A) = P(B|A)P(A)\), which implies,
\(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\)
Hence, the Bayes theorem formula for events is derived.
Bayes Theorem for Continuous Random Variables
The formula for continuous random variables X and Y derived from the definition of the conditional probability of continuous variables is:
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Derivation:
According to the definition of conditional density or conditional probability of continuous random variables, we know that \(f_{X|Y=y}(x)=\dfrac{f_{X,Y}(x,y)}{f_{Y}(y)}\) and \(f_{Y|X=x}(y)=\dfrac{f_{X,Y}(x,y)}{f_{X}(x)}\), which implies,
\(f_{X|Y=y}(x) = \dfrac{f_{Y|X=x}(y)f_{X}(x)}{f_{Y}(y)}\)
Hence, the Bayes theorem formula for continuous random variables is derived.
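A minimal grid-based sketch of the continuous formula, under an assumed model that is not from the text: prior X ~ N(0, 1) and likelihood Y | X = x ~ N(x, 1). The marginal density \(f_{Y}(y)\) is approximated by a Riemann sum, and the resulting posterior can be checked against the known conjugate-normal result:

```python
import math

# Grid-based sketch of Bayes theorem for continuous random variables.
# Assumed model (not from the text): prior X ~ N(0, 1),
# likelihood Y | X = x ~ N(x, 1), observed value y = 1.0.

def normal_pdf(z, mean, sd):
    """Density of a normal distribution with the given mean and sd."""
    return math.exp(-0.5 * ((z - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

y = 1.0
dx = 0.001
grid = [-8.0 + i * dx for i in range(16001)]  # 16001 points from -8 to 8

prior = [normal_pdf(x, 0.0, 1.0) for x in grid]     # f_X(x)
likelihood = [normal_pdf(y, x, 1.0) for x in grid]  # f_{Y|X=x}(y)

# Marginal f_Y(y) = integral of f_{Y|X=x}(y) f_X(x) dx, via a Riemann sum
f_Y = sum(l * p for l, p in zip(likelihood, prior)) * dx

# Posterior density f_{X|Y=y}(x) = f_{Y|X=x}(y) f_X(x) / f_Y(y)
posterior = [l * p / f_Y for l, p in zip(likelihood, prior)]

# For this conjugate-normal model the posterior is N(y/2, sqrt(1/2)),
# so the numerically computed posterior mean should be about 0.5.
post_mean = sum(x * q for x, q in zip(grid, posterior)) * dx
print(round(post_mean, 3))
```

The same grid approach works for any prior and likelihood whose densities can be evaluated pointwise, which is why it is a common way to visualize continuous Bayesian updates.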
Difference Between Conditional Probability and Bayes Theorem
| Conditional Probability | Bayes Theorem |
| --- | --- |
| Conditional probability is the probability of an event A based on the occurrence of another event B. | Bayes theorem is derived using the definition of conditional probability; its formula includes two conditional probabilities. |
| Formula: \(P(A \vert B) = \dfrac{P(A \cap B)}{P(B)}\) | Formula: \(P(A \vert B) = \dfrac{P(B \vert A)P(A)}{P(B)}\) |
Terms Related to Bayes Theorem
As we have studied Bayes theorem in detail, let us understand the meanings of a few terms related to the concept, which have been used in the Bayes theorem formula and derivation:
- Conditional Probability - Conditional Probability is the probability of an event A based on the occurrence of another event B. It is denoted by P(A|B) and represents the probability of A given that event B has already happened.
- Joint Probability - Joint probability measures the probability of two or more events occurring together at the same time. For two events A and B, it is denoted by \(P(A \cap B)\).
- Random Variables - A random variable is a real-valued variable whose possible values are determined by the outcomes of a random experiment.
- Posterior Probability - Posterior probability is the probability of an event that is calculated after all the information related to the event has been accounted for. It is also known as conditional probability.
- Prior Probability - Prior probability is the probability of an event that is calculated before considering the new information obtained. It is the probability of an outcome that is determined based on current knowledge before the experiment is performed.
Important Notes on Bayes Theorem
- Bayes theorem is used to determine conditional probability.
- When two events A and B are independent, P(A|B) = P(A) and P(B|A) = P(B)
- Conditional probability can be calculated using the Bayes theorem for continuous random variables.
☛ Also Check:
- Probability and statistics
FAQs on Bayes Theorem
What Is Bayes Theorem in Statistics?
Bayes theorem is a statistical formula to determine the conditional probability of an event. It describes the probability of an event based on prior knowledge of events that have already happened. Bayes theorem is named after the Reverend Thomas Bayes, and its formula for random events is \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\).
Here, P(A) = how likely A happens
P(B) = how likely B happens
P(A|B) = how likely A happens given that B has happened
P(B|A) = how likely B happens given that A has happened
What Does the Bayes Theorem State?
Let \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) be a set of events associated with a sample space S, where all events \(E_{1}, E_{2}, E_{3}, ..., E_{n}\) have non-zero probability of occurrence and they form a partition of S. Let A be any event associated with S, then according to Bayes Theorem,
\(P(E_{i} | A) = \dfrac{P(E_{i})P(A|E_{i})}{\sum_{k=1}^{n}P(E_{k})P(A|E_{k})} , i=1,2,3,...,n\)
How to Use Bayes Theorem?
To determine the probability of an event A given that the related event B has already occurred, that is, P(A|B) using the Bayes Theorem, we calculate the probability of the event B, that is, P(B); the probability of event B given that event A has occurred, that is, P(B|A); and the probability of the event A individually, that is, P(A). Then, we substitute these values into the formula \(P(A|B) = \dfrac{P(B|A)P(A)}{P(B)}\) to determine the probability using the Bayes Theorem.
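The three steps above can be followed with made-up numbers for a hypothetical spam-filter setting (all values invented for illustration):

```python
# The three steps above with made-up spam-filter numbers (illustrative only):
# A = "email is spam", B = "email contains the word 'offer'".
p_B_given_A = 0.6  # step 1: P(B|A), probability of the word given spam
p_A = 0.2          # step 2: P(A), overall spam rate
p_B = 0.25         # step 3: P(B), overall rate of the word

# Substitute into P(A|B) = P(B|A) P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 2))  # → 0.48
```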
Is Bayes Theorem for Independent Events?
If two events A and B are independent, then P(A|B) = P(A) and P(B|A) = P(B). In this case, Bayes theorem is not needed to determine the conditional probability, since the occurrence of one event gives no information about the other and the posterior probability simply equals the prior.
Is Conditional Probability the Same as Bayes Theorem?
Conditional probability is the probability of the occurrence of an event based on the occurrence of other events whereas the Bayes theorem is derived from the definition of conditional probability. Bayes theorem includes the two conditional probabilities.
What Is the Bayes Theorem in Machine Learning?
Bayes theorem provides a method to determine the probability of a hypothesis based on its prior probability, the probabilities of observing various data given the hypothesis, and the observed data itself. In machine learning, it is used to update the probability of a model or hypothesis as new data is observed, for example in Bayesian inference and naive Bayes classification.
Source: https://www.cuemath.com/data/bayes-theorem/