1 Probability Theory
1.0 Introduction
Axioms of Probability
- $0 \leq P(E) \leq 1$
- $P(S) = 1$
- For any sequence of mutually exclusive events $E_1, E_2, \dots$: $P(\bigcup_i E_i) = \sum_i P(E_i)$
Propositions
- $P(S) = P(E \cup E^c) = 1$
- $E \subset F \Rightarrow P(E) \leq P(F)$
- $P(E_1 \cup E_2 \cup \dots \cup E_n) = \sum_{i=1}^{n}P(E_i) - \sum_{i<j}P(E_iE_j) + \sum_{i<j<k}P(E_iE_jE_k) - \dots + (-1)^{n+1}P(E_1E_2\dots E_n)$ (inclusion-exclusion; checked for three events in the sketch below)
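A minimal Python sketch that brute-force checks the inclusion-exclusion proposition; the die sample space and the three events are hypothetical choices for illustration, not from the text above.

```python
from itertools import combinations
from fractions import Fraction

# Hypothetical setup: one fair die, S = {1,...,6}, uniform probability measure.
S = set(range(1, 7))
P = lambda A: Fraction(len(A), len(S))

# Three arbitrary, non-disjoint events.
E1, E2, E3 = {1, 2, 3}, {2, 4, 6}, {3, 4, 5}
events = [E1, E2, E3]

# Left side: P(E1 u E2 u E3) computed directly from the union.
lhs = P(E1 | E2 | E3)

# Right side: inclusion-exclusion, intersections of size r carry sign (-1)^(r+1).
rhs = Fraction(0)
for r in range(1, len(events) + 1):
    for combo in combinations(events, r):
        rhs += (-1) ** (r + 1) * P(set.intersection(*combo))

print(lhs, rhs, lhs == rhs)   # 1 1 True
```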
Conditional Probability
- $P(E|F) = \frac{P(EF)}{P(F)} = \frac{P(EF)}{P(EF) + P(E^cF)} = \frac{P(F|E)P(E)}{P(F|E)P(E) + P(F|E^c)P(E^c)}$ (Bayes' rule; a worked numeric example follows this list)
- $P(EF) = P(E)P(F) \Leftrightarrow E$ is independent of $F$ (equivalently $P(E|F) = P(E)$ when $P(F) > 0$)
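As an illustration of the Bayes' rule form above, a small Python computation; all the numbers here are hypothetical.

```python
# Hypothetical numbers: E = "has the condition", F = "test is positive".
P_E       = 0.01    # prior P(E)
P_F_if_E  = 0.95    # P(F|E),   test sensitivity
P_F_if_Ec = 0.05    # P(F|E^c), false-positive rate

# Total probability: P(F) = P(F|E)P(E) + P(F|E^c)P(E^c)
P_F = P_F_if_E * P_E + P_F_if_Ec * (1 - P_E)

# Bayes' rule: P(E|F) = P(F|E)P(E) / P(F)
P_E_if_F = P_F_if_E * P_E / P_F
print(round(P_E_if_F, 4))   # ~0.161: P(E|F) stays far below P(F|E) because P(E) is small
```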
Properties
- Expectation (Mean): $E[a] = a$, $E[aX + bY + c] = aE[X] + bE[Y] + c$
- Variance: $Var(X) \geq 0$, $Var(a) = 0$, $Var(aX + bY + c) = a^2Var(X) + b^2Var(Y)$ if $X$ and $Y$ are independent
- Standard Deviation: $\sigma = \sqrt{Var(X)}$ (a Monte Carlo check of these properties follows below)
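A Monte Carlo sanity check of the expectation and variance properties listed above; the distributions of $X$ and $Y$ and the constants $a, b, c$ are arbitrary choices for this sketch, and NumPy is assumed to be available.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Two independent random variables (arbitrary choices for the sketch).
X = rng.exponential(scale=2.0, size=n)   # E[X] = 2, Var(X) = 4
Y = rng.uniform(0.0, 6.0, size=n)        # E[Y] = 3, Var(Y) = 3
a, b, c = 2.0, -1.0, 5.0

Z = a * X + b * Y + c
print(Z.mean())                    # ~ a*E[X] + b*E[Y] + c = 2*2 - 3 + 5 = 6
print(Z.var())                     # ~ a^2*Var(X) + b^2*Var(Y) = 4*4 + 1*3 = 19
print(Z.std(), np.sqrt(Z.var()))   # standard deviation = sqrt(variance)
```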
1.1 Random Variable
Discrete RV
- PMF (Probability Mass Function): $p(x) = P(X=x) \geq 0$
- $\sum_{x}p(x) = 1$
- JPMF (Joint PMF): $p(x,y) = P(X=x, Y=y) \geq 0$
- $\sum_{x,y}p(x,y) = 1$
- Marginal PMF: $p_X(x) = \sum_{y} p(x,y)$, $p_Y(y) = \sum_{x} p(x,y)$
- $E[g(X)] = \sum_{x:p(x)>0}g(x)p(x)$
- $Var(X) = \sigma^2 = E[(X - E[X])^2] = E[X^2] - E[X]^2$
- $Var(aX+bY+c) = a^2Var(X) + b^2Var(Y)$ if $X$ and $Y$ are independent
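A minimal numeric check of the discrete formulas above, using a small hand-made PMF; the values and probabilities are hypothetical.

```python
# Hypothetical PMF for a discrete X.
pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}
assert abs(sum(pmf.values()) - 1.0) < 1e-12       # a PMF must sum to 1

E_X   = sum(x * p for x, p in pmf.items())        # E[X]   = sum_x x p(x)
E_X2  = sum(x**2 * p for x, p in pmf.items())     # E[X^2] = sum_x x^2 p(x)
Var_X = E_X2 - E_X**2                             # Var(X) = E[X^2] - E[X]^2
print(round(E_X, 4), round(Var_X, 4))             # 1.6 0.84
```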
Continuous RV
- PDF (Probability Density Function): $\int_a^b f(x)\,dx = P(a \leq X \leq b) = P(a < X < b)$, $f(x) \geq 0$
- $\int_{-\infty}^{\infty} f(x)\,dx = 1$
- JPDF (Joint PDF): $f(x,y) \geq 0$, $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dx\,dy = 1$
- $E[g(X)] = \int_{-\infty}^{\infty} g(x)f(x)\,dx$
- $E[aX+bY+c] = aE[X] + bE[Y] + c$
- $Var(X) = \sigma^2 = E[(X - E[X])^2] = E[X^2] - E[X]^2$
- $Var(aX+bY+c) = a^2Var(X) + b^2Var(Y)$ if $X$ and $Y$ are independent
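The same moment formulas can be checked for a continuous density by numerical integration. Below is a sketch for an assumed $Exp(\lambda = 0.5)$ density, using SciPy's `quad` (SciPy assumed available).

```python
import numpy as np
from scipy.integrate import quad

lam = 0.5
f = lambda x: lam * np.exp(-lam * x)               # Exponential(lambda=0.5) density on x >= 0

total, _ = quad(f, 0, np.inf)                      # integral of f over its support -> 1
E_X, _   = quad(lambda x: x * f(x), 0, np.inf)     # E[X]   = integral of x f(x)
E_X2, _  = quad(lambda x: x**2 * f(x), 0, np.inf)  # E[X^2] = integral of x^2 f(x)
Var_X = E_X2 - E_X**2                              # Var(X) = E[X^2] - E[X]^2

print(total, E_X, Var_X)   # ~1.0, ~2.0 (= 1/lambda), ~4.0 (= 1/lambda^2)
```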
1.2 Probability Distribution
Discrete Probability Distribution
Distribution | Values of $x$ | $p(x)$ | $E[X]$ | $Var(X)$ |
---|---|---|---|---|
Bernoulli $X \sim Ber(p)$ | Result of a Bernoulli trial: $x = 0$ (failure), $1$ (success) | $p^x (1 - p)^{1 - x}$ | $p$ | $p(1 - p)$ |
Binomial $X \sim B(n,p)$ | Number of successes in $n$ trials: $x = 0, 1, \dots, n$ | $\binom{n}{x} p^x (1 - p)^{n - x}$ <br> $p(k+1) = \frac{p}{1-p}\frac{n-k}{k+1}p(k)$ | $np$ | $np(1 - p)$ |
Poisson $X \sim Pois(\lambda)$ | Number of arrivals: $x = 0, 1, 2, \dots$ | $\frac{e^{-\lambda} \lambda^x}{x!}$ <br> $p(k+1) = \frac{\lambda}{k+1}p(k)$ | $\lambda$ | $\lambda$ |
Geometric $X \sim Geo(p)$ | Trials until the 1st success: $x = 1, 2, \dots$ | $(1 - p)^{x - 1} p$ | $\frac{1}{p}$ | $\frac{1 - p}{p^2}$ |
Negative Binomial $X \sim NB(r,p)$ | Trials until the $r$-th success: $x = r, r+1, \dots$ | $\binom{x - 1}{r - 1} p^r (1 - p)^{x - r}$ | $\frac{r}{p}$ | $\frac{r(1 - p)}{p^2}$ |
Hypergeometric $X \sim Hypergeo(n,N,m)$ | Marked items when drawing $n$ without replacement from $N$ items, $m$ of which are marked: $\max(0, n - (N - m)) \leq x \leq \min(n, m)$ | $\frac{\binom{m}{x} \binom{N - m}{n - x}}{\binom{N}{n}}$ | $\frac{nm}{N}$ | $E[X]\left[\frac{(n-1)(m-1)}{N-1} + 1 - E[X]\right]$ <br> $= \frac{nm}{N}\left[\frac{(n-1)(m-1)}{N-1} + 1 - \frac{nm}{N}\right]$ <br> $= \frac{nm(N - m)(N - n)}{N^2(N - 1)}$ |
- $Ber(p) = B(1,p)$
- $B(n,p) \approx Pois(\lambda = np)$ when $n \to \infty$, $p \to 0$, $np \to \lambda$
- $Geo(p) = NB(1,p)$
- $Hypergeo(n,N,m) \approx B(n,p)$ with $p = \frac{m}{N}$ (selection effectively with replacement) when $m$ and $N$ are large relative to $n$ and $x$ (both approximations are compared numerically below)
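A numeric comparison of the two approximations above, with the PMFs written out in pure Python via `math.comb`; the parameter values are arbitrary examples.

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def pois_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

def hypergeo_pmf(x, n, N, m):
    return comb(m, x) * comb(N - m, n - x) / comb(N, n)

# B(n,p) vs Pois(np) for large n, small p (here n=1000, p=0.002, lambda=2).
n, p = 1000, 0.002
for x in range(5):
    print(x, round(binom_pmf(x, n, p), 5), round(pois_pmf(x, n * p), 5))   # close agreement

# Hypergeo(n,N,m) vs B(n, m/N) when N, m are large relative to n and x.
n, N, m = 10, 10_000, 2_000
for x in range(4):
    print(x, round(hypergeo_pmf(x, n, N, m), 5), round(binom_pmf(x, n, m / N), 5))
```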
Continuous Probability Distribution
Distribution | Values of $x$ | $f(x)$ | $E[X]$ | $Var(X)$ |
---|---|---|---|---|
Exponential $X \sim Exp(\lambda)$ | Time between events: $x \geq 0$ | $\lambda e^{-\lambda x}$ | $\frac{1}{\lambda}$ | $\frac{1}{\lambda^2}$ |
Uniform $X \sim U(\alpha, \beta)$ | Outcomes with equal density on the interval $\alpha < x < \beta$ | $\frac{1}{\beta - \alpha}$ | $\frac{\alpha + \beta}{2}$ | $\frac{(\beta - \alpha)^2}{12}$ |
- Normal: $X \sim N(\mu, \sigma^2)$, $f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}e^{-(x - \mu)^2/2\sigma^2}$, $E[X] = \mu$, $Var(X) = \sigma^2$ (a simulation check of these moments follows below)
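A simulation sketch comparing sample moments against the $E[X]$ and $Var(X)$ formulas above for the exponential, uniform, and normal cases; the parameter values are arbitrary and NumPy is assumed to be available.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

lam = 2.0
expo = rng.exponential(scale=1 / lam, size=n)   # Exp(lambda): E[X] = 1/lambda, Var(X) = 1/lambda^2
print(expo.mean(), expo.var())                  # ~0.5, ~0.25

a, b = -1.0, 3.0
unif = rng.uniform(a, b, size=n)                # U(alpha,beta): E[X] = (a+b)/2, Var(X) = (b-a)^2/12
print(unif.mean(), unif.var())                  # ~1.0, ~1.333

mu, sigma = 10.0, 2.0
norm = rng.normal(mu, sigma, size=n)            # N(mu, sigma^2): E[X] = mu, Var(X) = sigma^2
print(norm.mean(), norm.var())                  # ~10.0, ~4.0
```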
CDF (Cumulative Distribution Function): $F(x) = P(X \leq x)$
JCDF (Joint CDF): $F(x,y) = P(X \leq x, Y \leq y)$ (an empirical-vs-theoretical CDF check follows below)
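A quick empirical check of the CDF definition: for an assumed $Exp(\lambda)$ sample, the fraction of draws with $X \leq x$ should approach the closed form $F(x) = 1 - e^{-\lambda x}$ (NumPy assumed available; the evaluation points are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 1.5
sample = rng.exponential(scale=1 / lam, size=100_000)

# Empirical CDF at a few points vs F(x) = 1 - exp(-lambda*x) for Exp(lambda).
for x in [0.5, 1.0, 2.0]:
    empirical   = np.mean(sample <= x)       # fraction of the sample with X <= x
    theoretical = 1 - np.exp(-lam * x)
    print(x, round(empirical, 4), round(theoretical, 4))   # the two columns agree closely
```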