Probability theory is a fundamental branch of mathematics that deals with the analysis of random phenomena. The central objects of study in probability theory are random variables, stochastic processes, and events, which are abstract mathematical representations of real-world outcomes and uncertainties.
At its core, probability theory provides a framework for quantifying the likelihood of various outcomes. This involves defining a probability space, which is composed of a sample space \( \Omega \), a \(\sigma\)-algebra \(\mathcal{F}\), and a probability measure \( P \). Formally, a probability space is denoted as \( (\Omega, \mathcal{F}, P) \).
Sample Space (\(\Omega\)): The set of all possible outcomes of a random experiment. For example, when tossing a fair coin, the sample space is \(\Omega = \{\text{Heads}, \text{Tails}\}\).
\(\sigma\)-Algebra (\(\mathcal{F}\)): A collection of subsets of \(\Omega\), including the empty set and \(\Omega\) itself, that is closed under complementation and countable unions. These subsets are the events to which probabilities are assigned.
Probability Measure (\(P\)): A function that assigns a probability to each event in \(\mathcal{F}\). The probability measure must satisfy three axioms:
- Non-negativity: \( P(A) \geq 0 \) for all \( A \in \mathcal{F} \).
- Normalization: \( P(\Omega) = 1 \).
- Countable Additivity: For any countable sequence of pairwise disjoint events \( \{A_i\}_{i=1}^{\infty} \subset \mathcal{F} \), \( P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i) \).
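These definitions can be made concrete on a finite sample space, where the power set of \(\Omega\) is a valid \(\sigma\)-algebra and the axioms can be checked directly. The following sketch uses a hypothetical fair six-sided die with the uniform measure \(P(A) = |A|/|\Omega|\) (for a finite space, countable additivity reduces to finite additivity):

```python
from itertools import chain, combinations

# Hypothetical finite example: a fair six-sided die.
omega = frozenset(range(1, 7))  # sample space

# For a finite sample space, the power set is a valid sigma-algebra.
def power_set(s):
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

events = power_set(omega)  # 2^6 = 64 events

# Uniform probability measure: P(A) = |A| / |Omega|.
def P(A):
    return len(A) / len(omega)

# Check the three axioms on this finite space.
assert all(P(A) >= 0 for A in events)                 # non-negativity
assert P(omega) == 1                                  # normalization
A, B = frozenset({1, 2}), frozenset({5, 6})           # disjoint events
assert abs(P(A | B) - (P(A) + P(B))) < 1e-12          # additivity
```

Note that the power-set construction is only practical for small finite spaces; for infinite \(\Omega\), the \(\sigma\)-algebra is typically a proper subcollection of subsets.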
A random variable is a measurable function \( X: \Omega \to \mathbb{R} \) that assigns a real number to each outcome in the sample space. The distribution of a random variable describes how probabilities are assigned to sets of values on the real line, and is often represented by a probability density function (PDF) for continuous random variables or a probability mass function (PMF) for discrete random variables.
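The view of a random variable as a function on the sample space can be illustrated directly. In this hypothetical sketch, the experiment is two fair coin tosses and \(X\) counts the number of heads; the PMF of \(X\) is obtained by pushing the uniform measure on \(\Omega\) forward through \(X\):

```python
from itertools import product
from collections import Counter

# Hypothetical example: two fair coin tosses; X counts the heads.
omega = list(product(["H", "T"], repeat=2))  # sample space, 4 outcomes

def X(outcome):
    """Random variable: maps each outcome to a real number."""
    return outcome.count("H")

# PMF of X under the uniform measure on omega:
# P(X = x) = (number of outcomes mapped to x) / |omega|.
counts = Counter(X(w) for w in omega)
pmf = {x: n / len(omega) for x, n in counts.items()}
# pmf == {0: 0.25, 1: 0.5, 2: 0.25}
```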
The expected value (or mean) of a random variable \( X \), denoted \( \mathbb{E}[X] \), is a measure of the central tendency of the distribution of \( X \). For a discrete random variable, it is given by:
\[ \mathbb{E}[X] = \sum_{x \in \text{Range}(X)} x \cdot P(X = x), \]
and for a continuous random variable, it is:
\[ \mathbb{E}[X] = \int_{-\infty}^{\infty} x \cdot f_X(x) \, dx, \]
where \( f_X \) is the PDF of \( X \).
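Both formulas can be evaluated numerically, as a sketch with two hypothetical distributions: the discrete sum for a fair die roll, and the continuous integral for \(X \sim \text{Uniform}(0, 1)\) (where \(f_X(x) = 1\) on \([0, 1]\)) approximated by a midpoint Riemann sum:

```python
# Discrete case: expected value of a fair die roll.
pmf = {x: 1 / 6 for x in range(1, 7)}
mean_discrete = sum(x * p for x, p in pmf.items())  # = 3.5

# Continuous case: E[X] for X ~ Uniform(0, 1), f_X(x) = 1 on [0, 1].
# The integral is approximated with a midpoint Riemann sum.
n = 100_000
dx = 1 / n
mean_continuous = sum(((i + 0.5) * dx) * 1.0 * dx for i in range(n))
# ≈ 0.5
```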
Variance, another important concept, measures the spread of a random variable’s values around the mean. It is defined as:
\[ \text{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2]. \]
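The definition translates directly into a computation. Continuing the hypothetical fair-die example, the variance is the probability-weighted average of the squared deviations from the mean:

```python
# Variance of a fair die roll, computed straight from the definition.
pmf = {x: 1 / 6 for x in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())               # E[X] = 3.5
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
# var = 35/12 ≈ 2.9167
```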
Probability theory also delves into the study of stochastic processes, which are collections of random variables indexed by time or space. Common examples include Markov chains, Poisson processes, and Brownian motion, each of which models different types of random behavior over time.
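A discrete-time Markov chain, the simplest of these processes, can be simulated in a few lines. The two-state weather chain and its transition probabilities below are hypothetical choices for illustration; the key property shown is that the next state depends only on the current one:

```python
import random

random.seed(0)  # fixed seed so the sample path is reproducible

# Hypothetical two-state Markov chain on {"sunny", "rainy"}:
# each entry lists (next state, transition probability).
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state given only the current one (Markov property)."""
    r, cum = random.random(), 0.0
    for nxt, p in transitions[state]:
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding in the cumulative sum

# One sample path: a sequence of random variables indexed by time.
path = ["sunny"]
for _ in range(10):
    path.append(step(path[-1]))
```

Poisson processes and Brownian motion can be simulated similarly, by sampling exponential inter-arrival times or Gaussian increments, respectively.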
Understanding probability theory is essential for many fields, including statistics, finance, engineering, and the natural and social sciences. It provides the mathematical underpinning for making inferences from data, assessing risks, and modeling the uncertainty inherent in various phenomena.