Fiveable

📊 Actuarial Mathematics Unit 1 – Probability Theory & Distributions

Probability theory and distributions form the foundation of actuarial mathematics. These concepts quantify uncertainty and model random events, which is crucial for pricing insurance policies and assessing financial risks. Understanding probability basics, random variables, and common distributions is essential for actuaries.

Key applications in actuarial science include modeling claim frequencies and severities, calculating risk measures, estimating reserves, and assessing solvency. Actuaries use various probability distributions to analyze data, make predictions, and develop pricing models for insurance products and financial instruments.

Key Concepts and Terminology

  • Probability measures the likelihood of an event occurring and ranges from 0 (impossible) to 1 (certain)
  • Random variables assign numerical values to outcomes of a random experiment
    • Discrete random variables have countable outcomes (number of heads in 10 coin flips)
    • Continuous random variables have uncountable outcomes (time until next customer arrives)
  • Probability distributions describe the probabilities of different outcomes for a random variable
  • Expectation (mean) represents the average value of a random variable over many trials
  • Variance and standard deviation measure the dispersion or spread of a probability distribution
  • Independence implies that the occurrence of one event does not affect the probability of another event
  • Conditional probability measures the probability of an event given that another event has occurred
  • Bayes' theorem relates conditional probabilities and is used for updating probabilities based on new information
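
To make the expectation and variance concepts above concrete, here is a small simulation sketch (parameters are just for illustration): the number of heads in 10 fair coin flips is a discrete random variable with mean $np = 5$ and variance $np(1-p) = 2.5$, and a large simulated sample should land close to both.

```python
import random
from statistics import mean, pvariance

random.seed(42)

# Discrete random variable: the number of heads in 10 fair coin flips.
# Expectation is n*p = 5; variance is n*p*(1-p) = 2.5.
def heads_in_ten_flips():
    return sum(random.random() < 0.5 for _ in range(10))

trials = [heads_in_ten_flips() for _ in range(100_000)]
print(mean(trials), pvariance(trials))  # both should be near 5 and 2.5
```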

Probability Basics

  • Probability axioms define the properties that probability measures must satisfy
    • Non-negativity: $P(A) \geq 0$ for any event $A$
    • Normalization: $P(\Omega) = 1$, where $\Omega$ is the sample space (set of all possible outcomes)
    • Countable additivity: For mutually exclusive events $A_1, A_2, \ldots$, $P(\bigcup_{i=1}^{\infty} A_i) = \sum_{i=1}^{\infty} P(A_i)$
  • Probability of the complement: $P(A^c) = 1 - P(A)$, where $A^c$ is the complement of event $A$
  • Probability of the union of two events: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
  • Conditional probability: $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, where $P(B) > 0$
  • Independence: Events $A$ and $B$ are independent if $P(A \cap B) = P(A)P(B)$
  • Multiplication rule: $P(A \cap B) = P(A)P(B \mid A)$
  • Law of total probability: For a partition $\{B_1, B_2, \ldots, B_n\}$ of the sample space, $P(A) = \sum_{i=1}^{n} P(A \mid B_i)P(B_i)$
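
The law of total probability and Bayes' theorem combine naturally in a small numeric sketch. The figures below are purely hypothetical: suppose 20% of policyholders are "high-risk", a high-risk policyholder files a claim with probability 0.4 in a year, and a low-risk one with probability 0.1.

```python
# Hypothetical parameters for a two-class risk portfolio
p_high = 0.20              # P(high-risk)
p_claim_given_high = 0.40  # P(claim | high-risk)
p_claim_given_low = 0.10   # P(claim | low-risk)

# Law of total probability: P(claim) = sum over the risk classes
p_claim = p_claim_given_high * p_high + p_claim_given_low * (1 - p_high)

# Bayes' theorem: P(high | claim) = P(claim | high) P(high) / P(claim)
p_high_given_claim = p_claim_given_high * p_high / p_claim

print(p_claim, p_high_given_claim)
```

Observing a claim raises the probability that the policyholder is high-risk from 0.20 to 0.08/0.16 = 0.50 — exactly the kind of belief update Bayes' theorem formalizes.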

Random Variables

  • Random variables map outcomes of a random experiment to real numbers
  • Probability mass function (PMF) for a discrete random variable $X$: $p_X(x) = P(X = x)$
    • Properties: $p_X(x) \geq 0$ and $\sum_x p_X(x) = 1$
  • Cumulative distribution function (CDF) for a random variable $X$: $F_X(x) = P(X \leq x)$
    • Properties: $F_X(x)$ is non-decreasing, right-continuous, $\lim_{x \to -\infty} F_X(x) = 0$, and $\lim_{x \to \infty} F_X(x) = 1$
  • Probability density function (PDF) for a continuous random variable $X$: $f_X(x) = F_X'(x)$
    • Properties: $f_X(x) \geq 0$ and $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
  • Relationship between CDF and PDF: $F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt$
  • Quantile function (inverse CDF): $Q_X(p) = \inf\{x : F_X(x) \geq p\}$, where $0 < p < 1$
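
The PMF, CDF, and quantile definitions above can be sketched directly for a fair six-sided die (exact fractions are used to avoid floating-point rounding in the CDF comparison):

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def cdf(x):
    # F_X(x) = P(X <= x): sum the PMF over outcomes at or below x
    return sum(p for value, p in pmf.items() if value <= x)

def quantile(p):
    # Q_X(p) = inf{x : F_X(x) >= p}, the generalized inverse of the CDF
    for x in sorted(pmf):
        if cdf(x) >= p:
            return x

print(cdf(3), quantile(0.5))  # F(3) = 1/2, so the median is 3
```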

Probability Distributions

  • Bernoulli distribution: Models a single trial with two possible outcomes (success or failure)
    • PMF: $p_X(x) = p^x(1-p)^{1-x}$, where $x \in \{0, 1\}$ and $0 < p < 1$
  • Binomial distribution: Models the number of successes in a fixed number of independent Bernoulli trials
    • PMF: $p_X(x) = \binom{n}{x} p^x (1-p)^{n-x}$, where $x \in \{0, 1, \ldots, n\}$, $0 < p < 1$, and $\binom{n}{x} = \frac{n!}{x!(n-x)!}$
  • Poisson distribution: Models the number of events occurring in a fixed interval of time or space
    • PMF: $p_X(x) = \frac{e^{-\lambda}\lambda^x}{x!}$, where $x \in \{0, 1, 2, \ldots\}$ and $\lambda > 0$
  • Exponential distribution: Models the time between events in a Poisson process
    • PDF: $f_X(x) = \lambda e^{-\lambda x}$, where $x > 0$ and $\lambda > 0$
  • Normal (Gaussian) distribution: Models many natural phenomena and is characterized by its bell-shaped curve
    • PDF: $f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}$, where $x \in \mathbb{R}$, $\mu \in \mathbb{R}$, and $\sigma > 0$
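
A classic link between two of these distributions: for large $n$ and small $p$, the binomial PMF is well approximated by a Poisson PMF with $\lambda = np$. A quick sketch (the parameter values are illustrative):

```python
from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    # P(X = x) = e^{-lam} lam^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Large n, small p: Binomial(1000, 0.002) vs. Poisson(lambda = 2)
n, p = 1000, 0.002
for x in range(5):
    print(x, round(binomial_pmf(x, n, p), 4), round(poisson_pmf(x, n * p), 4))
```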

Common Probability Distributions

  • Uniform distribution: Models a random variable with equally likely outcomes over a specified interval
    • Discrete uniform: $p_X(x) = \frac{1}{n}$, where $x \in \{1, 2, \ldots, n\}$
    • Continuous uniform: $f_X(x) = \frac{1}{b-a}$, where $x \in [a, b]$
  • Geometric distribution: Models the number of trials until the first success in a sequence of independent Bernoulli trials
    • PMF: $p_X(x) = (1-p)^{x-1} p$, where $x \in \{1, 2, \ldots\}$ and $0 < p < 1$
  • Negative binomial distribution: Models the number of trials until a specified number of successes occurs in a sequence of independent Bernoulli trials
    • PMF: $p_X(x) = \binom{x-1}{r-1} p^r (1-p)^{x-r}$, where $x \in \{r, r+1, \ldots\}$, $0 < p < 1$, and $r \in \{1, 2, \ldots\}$
  • Gamma distribution: Generalizes the exponential distribution and models waiting times and lifetimes
    • PDF: $f_X(x) = \frac{1}{\Gamma(\alpha)\beta^\alpha} x^{\alpha-1} e^{-x/\beta}$, where $x > 0$, $\alpha > 0$, and $\beta > 0$
  • Beta distribution: Models probabilities, proportions, and percentages
    • PDF: $f_X(x) = \frac{1}{B(\alpha, \beta)} x^{\alpha-1} (1-x)^{\beta-1}$, where $x \in (0, 1)$, $\alpha > 0$, and $\beta > 0$
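
The gamma PDF above can be coded directly, and two sanity checks fall out: with $\alpha = 1$ and $\beta = 1/\lambda$ it reduces to the exponential PDF, and a crude numerical integration of the density should give roughly 1. A sketch with illustrative parameter values:

```python
from math import gamma, exp

def gamma_pdf(x, alpha, beta):
    # f_X(x) = x^(alpha-1) e^(-x/beta) / (Gamma(alpha) beta^alpha)
    return x**(alpha - 1) * exp(-x / beta) / (gamma(alpha) * beta**alpha)

def exponential_pdf(x, lam):
    # f_X(x) = lam e^(-lam x), the alpha = 1 special case
    return lam * exp(-lam * x)

# Special case: Gamma(alpha=1, beta=1/lam) equals Exponential(lam)
lam = 0.5
print(gamma_pdf(2.0, 1, 1 / lam), exponential_pdf(2.0, lam))

# Crude Riemann-sum check that the Gamma(3, 2) density integrates to ~1
dx = 0.001
total = sum(gamma_pdf(i * dx, 3, 2) * dx for i in range(1, 100_000))
print(round(total, 3))
```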

Expectation and Variance

  • Expectation (mean) of a discrete random variable $X$: $E[X] = \sum_x x\, p_X(x)$
  • Expectation (mean) of a continuous random variable $X$: $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$
  • Linearity of expectation: $E[aX + bY] = aE[X] + bE[Y]$ for constants $a$ and $b$ and random variables $X$ and $Y$
  • Variance of a random variable $X$: $\mathrm{Var}(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2$
  • Standard deviation: $\sigma_X = \sqrt{\mathrm{Var}(X)}$
  • Covariance between random variables $X$ and $Y$: $\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])]$
    • Properties: $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$ and $\mathrm{Cov}(aX + b, cY + d) = ac\,\mathrm{Cov}(X, Y)$ for constants $a$, $b$, $c$, and $d$
  • Correlation coefficient between random variables $X$ and $Y$: $\rho_{X,Y} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$
    • Properties: $-1 \leq \rho_{X,Y} \leq 1$, with $\rho_{X,Y} = 1$ for a perfect positive linear relationship and $\rho_{X,Y} = -1$ for a perfect negative linear relationship
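
The discrete formulas above translate line by line into code. A sketch using a toy claim-size distribution (the values and probabilities are hypothetical):

```python
from math import sqrt

# Hypothetical claim-size PMF: value -> probability
pmf = {0: 0.5, 100: 0.3, 1000: 0.2}

mean = sum(x * p for x, p in pmf.items())            # E[X] = sum x p_X(x)
second_moment = sum(x**2 * p for x, p in pmf.items())  # E[X^2]
variance = second_moment - mean**2                   # Var(X) = E[X^2] - (E[X])^2
std_dev = sqrt(variance)

print(mean, variance, round(std_dev, 2))
```

Note how heavily the rare large claim drives the spread: the mean is 230 but the standard deviation is well over 300.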

Multivariate Distributions

  • Joint probability mass function (PMF) for discrete random variables $X$ and $Y$: $p_{X,Y}(x, y) = P(X = x, Y = y)$
  • Joint probability density function (PDF) for continuous random variables $X$ and $Y$: $f_{X,Y}(x, y)$
    • Properties: $f_{X,Y}(x, y) \geq 0$ and $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx\,dy = 1$
  • Marginal PMF: $p_X(x) = \sum_y p_{X,Y}(x, y)$ and $p_Y(y) = \sum_x p_{X,Y}(x, y)$
  • Marginal PDF: $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dx$
  • Conditional PMF: $p_{Y|X}(y|x) = \frac{p_{X,Y}(x, y)}{p_X(x)}$, where $p_X(x) > 0$
  • Conditional PDF: $f_{Y|X}(y|x) = \frac{f_{X,Y}(x, y)}{f_X(x)}$, where $f_X(x) > 0$
  • Independence for discrete random variables: $p_{X,Y}(x, y) = p_X(x) p_Y(y)$ for all $x$ and $y$
  • Independence for continuous random variables: $f_{X,Y}(x, y) = f_X(x) f_Y(y)$ for all $x$ and $y$
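
For the discrete case, marginals, conditionals, and the independence check above can all be computed from a joint PMF table. A sketch with a hypothetical joint distribution for two binary random variables:

```python
# Hypothetical joint PMF: (x, y) -> P(X = x, Y = y)
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

def marginal_x(x):
    # p_X(x) = sum over y of p_{X,Y}(x, y)
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def marginal_y(y):
    return sum(p for (xi, yi), p in joint.items() if yi == y)

def cond_y_given_x(y, x):
    # p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x)
    return joint[(x, y)] / marginal_x(x)

print(marginal_x(0), marginal_y(1), cond_y_given_x(1, 0))

# Independence requires p(x, y) = p_X(x) p_Y(y) for every pair
independent = all(abs(joint[(x, y)] - marginal_x(x) * marginal_y(y)) < 1e-12
                  for x in (0, 1) for y in (0, 1))
print(independent)  # False: e.g. p(0,0) = 0.4 but p_X(0) p_Y(0) = 0.3
```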

Applications in Actuarial Science

  • Pricing insurance policies using probability distributions to model claim frequencies and severities
    • Poisson distribution for modeling claim counts
    • Exponential, gamma, and Pareto distributions for modeling claim sizes
  • Calculating risk measures such as Value-at-Risk (VaR) and Tail Value-at-Risk (TVaR) using quantiles of loss distributions
  • Estimating reserves for outstanding claims using stochastic models and simulation techniques
  • Assessing the solvency and capital requirements of insurance companies using probabilistic approaches
    • Solvency II and Swiss Solvency Test frameworks
  • Pricing and hedging financial derivatives using stochastic calculus and martingale methods
    • Black-Scholes model for pricing European options
    • Binomial and trinomial tree models for pricing American options
  • Modeling dependence between risks using copulas and multivariate distributions
    • Gaussian copula for modeling linear dependence
    • t-copula and Archimedean copulas (Clayton, Gumbel, Frank) for modeling non-linear dependence
  • Applying credibility theory to blend individual and collective risk information for experience rating
    • Bühlmann-Straub model and empirical Bayes estimators
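
The frequency-severity modeling and VaR ideas above can be combined in a short Monte Carlo sketch: Poisson claim counts, exponential claim sizes, and an empirical 99% VaR read off the simulated aggregate-loss distribution. All parameter values here are hypothetical, and this is only a minimal illustration, not a production pricing model.

```python
import random
from math import exp

random.seed(2024)

LAMBDA = 3.0              # hypothetical expected number of claims per year
MEAN_SEVERITY = 10_000.0  # hypothetical expected size of a single claim

def poisson_sample(lam):
    # Knuth's method: count uniforms until their product drops below e^{-lam}
    threshold, k, prod = exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

def aggregate_loss():
    # Compound model: S = X_1 + ... + X_N with N ~ Poisson, X_i ~ Exponential
    n = poisson_sample(LAMBDA)
    return sum(random.expovariate(1 / MEAN_SEVERITY) for _ in range(n))

losses = sorted(aggregate_loss() for _ in range(20_000))
mean_loss = sum(losses) / len(losses)     # should be near LAMBDA * MEAN_SEVERITY
var_99 = losses[int(0.99 * len(losses))]  # empirical 99% quantile (VaR)
print(round(mean_loss), round(var_99))
```

The simulated mean should sit near $E[N] \cdot E[X] = 30{,}000$, while the 99% VaR is several times larger — a reminder that tail risk, not the mean, drives capital requirements.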