CharacteristicFunction - crowlogic/arb4j GitHub Wiki

A characteristic function is a mathematical concept used in probability theory and statistics to completely describe the probability distribution of a random variable. It is a complex-valued function that encodes information about the moments of the distribution. The characteristic function is particularly useful in deriving distribution properties and in solving problems involving sums of independent random variables.

Given a random variable $X$, the characteristic function $\phi_X(t)$ is defined as the expected value of $e^{itX}$, where $i$ is the imaginary unit and $t$ is a real number:

$$ \phi_X(t) = \mathbb{E}[e^{itX}] $$

For continuous random variables with probability density function (pdf) $f_X(x)$, the characteristic function can be expressed as:

$$ \phi_X(t) = \int_{-\infty}^{\infty} e^{itx} f_X(x) dx $$
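This integral can be evaluated numerically. The sketch below (a hypothetical helper, not part of arb4j) approximates $\phi_X(t)$ for a standard normal density by quadrature and checks it against the known closed form $e^{-t^2/2}$:

```python
import numpy as np
from scipy.integrate import quad

def cf_numeric(pdf, t, lo=-40.0, hi=40.0):
    """Approximate phi_X(t) = integral of e^{itx} f_X(x) dx by quadrature.
    Real and imaginary parts are integrated separately."""
    re, _ = quad(lambda x: np.cos(t * x) * pdf(x), lo, hi)
    im, _ = quad(lambda x: np.sin(t * x) * pdf(x), lo, hi)
    return complex(re, im)

# Standard normal pdf; its characteristic function is exp(-t^2/2).
std_normal_pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

for t in (0.0, 0.5, 1.0, 2.0):
    approx = cf_numeric(std_normal_pdf, t)
    exact = np.exp(-t**2 / 2)
    assert abs(approx - exact) < 1e-8
```

The truncation of the integration range to $[-40, 40]$ is safe here because the Gaussian density decays far faster than the oscillating factor.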

For discrete random variables with probability mass function (pmf) $p_X(x)$, the characteristic function is given by:

$$ \phi_X(t) = \sum_{x} e^{itx} p_X(x) $$
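For a finite support the sum can be computed directly. As a small sanity check (an illustrative snippet, not arb4j code), a Bernoulli($p$) variable has the closed form $\phi_X(t) = 1 - p + p\,e^{it}$:

```python
import numpy as np

def cf_discrete(support, pmf, t):
    """phi_X(t) = sum over x of e^{itx} p_X(x), for a finite support."""
    return sum(np.exp(1j * t * x) * p for x, p in zip(support, pmf))

# Bernoulli(p): support {0, 1}, closed form phi(t) = 1 - p + p*e^{it}.
p, t = 0.3, 1.7
approx = cf_discrete([0, 1], [1 - p, p], t)
exact = (1 - p) + p * np.exp(1j * t)
assert abs(approx - exact) < 1e-12
```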

The inverse relationship, which allows us to recover the distribution of the random variable from its characteristic function, is given by the inverse Fourier transform. For a continuous random variable whose characteristic function is absolutely integrable, this is expressed as:

$$ f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx} \phi_X(t) dt $$
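The inversion formula can likewise be checked numerically. The sketch below (illustrative only; `pdf_from_cf` is a hypothetical helper) recovers the standard normal density at a few points from its characteristic function $\phi_X(t) = e^{-t^2/2}$:

```python
import numpy as np
from scipy.integrate import quad

def pdf_from_cf(cf, x, lo=-50.0, hi=50.0):
    """f_X(x) = (1/2*pi) * integral of e^{-itx} phi_X(t) dt, via quadrature.
    The imaginary part vanishes for a real-valued density, so only the
    real part of the integrand is integrated."""
    integrand = lambda t: (np.exp(-1j * t * x) * cf(t)).real
    val, _ = quad(integrand, lo, hi, limit=200)
    return val / (2 * np.pi)

cf_normal = lambda t: np.exp(-t**2 / 2)
for x in (0.0, 1.0, -2.0):
    exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    assert abs(pdf_from_cf(cf_normal, x) - exact) < 1e-8
```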

Characteristic functions have several useful properties, such as:

  1. Scaling of linear combinations: For *independent* random variables $X$ and $Y$ and constants $a$ and $b$, the characteristic function of the linear combination $Z = aX + bY$ factors as:

$$ \phi_Z(t) = \phi_{aX + bY}(t) = \phi_X(at) \phi_Y(bt) $$

  2. Convolution theorem: If $X$ and $Y$ are independent random variables, the characteristic function of their sum $Z = X + Y$ is the product of their characteristic functions:

$$ \phi_Z(t) = \phi_{X + Y}(t) = \phi_X(t) \phi_Y(t) $$
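The convolution theorem can be illustrated with Poisson variables, whose characteristic function has the closed form $\phi(t) = e^{\lambda(e^{it}-1)}$. The product of the characteristic functions of independent Poisson($\lambda_1$) and Poisson($\lambda_2$) variables equals the characteristic function of Poisson($\lambda_1 + \lambda_2$), confirming that the sum is again Poisson (a standalone check, not arb4j code):

```python
import numpy as np

def cf_poisson(lam, t):
    """Characteristic function of Poisson(lam): exp(lam * (e^{it} - 1))."""
    return np.exp(lam * (np.exp(1j * t) - 1))

lam1, lam2 = 2.0, 3.5
for t in (0.3, 1.0, 2.7):
    product = cf_poisson(lam1, t) * cf_poisson(lam2, t)
    sum_cf = cf_poisson(lam1 + lam2, t)
    assert abs(product - sum_cf) < 1e-12
```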

These properties make characteristic functions a powerful tool in the analysis of probability distributions and the study of sums of random variables.