Bayesian Inference
Conditional Probability
- Events are independent when their probability is unaffected by other events.
- ex. when flipping a coin, you always have a 50% chance of getting tails.
- Events are dependent when their probability changes as conditions change.
- P(A|B) is called conditional probability and expresses the probability that event A occurs given that event B has already happened.
- In general: P(A|B) = P(A∩B)/P(B)
- If A and B are independent, P(A∩B) = P(A)P(B), so P(A|B) = P(A)P(B)/P(B) = P(A)
Considering the formula favorable outcomes / sample space: for A given B to happen, both A and B must happen (the favorable outcomes are A∩B), and the sample space is all of B, since we know that B has occurred.
It is also important to observe that P(A|B) is generally different from P(B|A)
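To make this concrete, here is a minimal Python sketch that computes P(A|B) by enumerating a small sample space and shows that it differs from P(B|A). The two-dice events below are hypothetical, chosen only for illustration.

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # all 36 equally likely rolls of two dice

A = {o for o in outcomes if sum(o) == 8}          # event A: the sum is 8
B = {o for o in outcomes if o[0] == 6}            # event B: the first die shows 6

def P(event):
    """Probability as favorable outcomes / sample space."""
    return Fraction(len(event), len(outcomes))

p_A_given_B = P(A & B) / P(B)   # condition on B: the sample space shrinks to B
p_B_given_A = P(A & B) / P(A)   # condition on A instead

print(p_A_given_B)  # 1/6
print(p_B_given_A)  # 1/5 -> generally different from P(A|B)
```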
The Law of Total Probability
- If B1, B2, …, Bn are mutually exclusive events that cover the whole sample space (a partition)
- P(A) = P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bn)P(A|Bn)
- ex. the probability of being vegan is the probability of being a vegan man plus the probability of being a vegan woman, i.e. P(vegan) = P(man)P(vegan|man) + P(woman)P(vegan|woman) (see the sketch below)
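A minimal sketch of the formula applied to the vegan example. The gender split and vegan rates below are made-up numbers used purely for illustration.

```python
p_man, p_woman = 0.5, 0.5          # P(B1), P(B2): the partition by gender (assumed 50/50)
p_vegan_given_man = 0.03           # P(A|B1), hypothetical
p_vegan_given_woman = 0.07         # P(A|B2), hypothetical

# Law of total probability: P(A) = P(B1)*P(A|B1) + P(B2)*P(A|B2)
p_vegan = p_man * p_vegan_given_man + p_woman * p_vegan_given_woman
print(p_vegan)  # 0.05
```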
Additive Law
- P(A∪B) = P(A) + P(B) - P(A∩B); otherwise, we would count the intersection twice.
- ⇒ P(A∩B) = P(A) + P(B) - P(A∪B)
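As a quick sanity check, the identity can be verified on a small enumerable sample space; the two-dice events below are the same hypothetical ones used earlier.

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))
A = {o for o in outcomes if sum(o) == 8}      # sum is 8
B = {o for o in outcomes if o[0] == 6}        # first die is 6

def P(event):
    return Fraction(len(event), len(outcomes))

# The union counts the overlap only once.
assert P(A | B) == P(A) + P(B) - P(A & B)
```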
Multiplication Rule
- Since P(A|B) = P(A∩B)/P(B)
- multiplying both sides by P(B)
- ⇒ P(A∩B) = P(A|B)*P(B)
- ex. if you know that P(B) = 0.5 and P(A|B) = 0.8,
- then you can obtain P(A∩B) = 0.8*0.5 = 0.4
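The same computation in a couple of lines of Python, reusing the illustrative values above:

```python
p_B = 0.5
p_A_given_B = 0.8
p_A_and_B = p_A_given_B * p_B   # multiplication rule: P(A∩B) = P(A|B) * P(B)
print(p_A_and_B)                # 0.4
```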
Bayes's Law
- Take two events A and B
- Considering that P(A|B) = P(A∩B)/P(B)
- and substituting P(A∩B) = P(B|A)*P(A) from the multiplication rule, we get P(A|B) = P(B|A)*P(A)/P(B)
This rule relates the two conditional probabilities of a pair of events: it lets you compute P(A|B) when you only know P(B|A), P(A), and P(B) (see the sketch below).
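A minimal sketch of Bayes' law in action, using a hypothetical diagnostic-test scenario; all numbers are invented for illustration.

```python
p_disease = 0.01              # P(A): prior probability of having the disease (assumed)
p_pos_given_disease = 0.95    # P(B|A): probability of a positive test if diseased (assumed)
p_pos_given_healthy = 0.05    # P(B|not A): false-positive rate (assumed)

# P(B) via the law of total probability
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' law: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ~0.161, despite the test being "95% accurate"
```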