💹 Financial Mathematics Unit 3 – Probability Theory & Random Variables

Probability theory and random variables form the foundation of financial mathematics. These concepts help us understand uncertainty and risk in financial markets, from basic coin flips to complex portfolio management. Key ideas include probability measures, sample spaces, and events. We explore probability rules, random variables, and distributions. Expected values, variance, and covariance are crucial for analyzing financial data and making informed decisions.

Key Concepts and Definitions

  • Probability measures the likelihood of an event occurring and ranges from 0 (impossible) to 1 (certain)
  • Sample space ($\Omega$) includes all possible outcomes of an experiment or random process
  • Event (A) is a subset of the sample space containing one or more outcomes of interest
  • Mutually exclusive events cannot occur simultaneously in a single trial (rolling a 3 and a 4 on a die)
  • Independent events do not influence each other's probability (flipping a coin and rolling a die)
    • $P(A \cap B) = P(A) \cdot P(B)$ for independent events A and B
  • Conditional probability $P(A|B)$ measures the probability of event A given that event B has occurred
    • Calculated using the formula $P(A|B) = \frac{P(A \cap B)}{P(B)}$
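The definitions above can be checked directly on a small sample space. This sketch uses a fair six-sided die; the events A and B are chosen for illustration:

```python
from fractions import Fraction

# Uniform probability measure on the sample space of one roll of a fair die
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}          # event: the roll is even
B = {4, 5, 6}          # event: the roll is greater than 3

def prob(event):
    """P(E) under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

# Conditional probability: P(A|B) = P(A ∩ B) / P(B)
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)                       # 2/3

# Independence check: does P(A ∩ B) equal P(A) * P(B)?
print(prob(A & B) == prob(A) * prob(B))  # False: these events are dependent
```

Using `Fraction` keeps the probabilities exact, so the independence check is a true equality test rather than a floating-point comparison.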

Probability Basics

  • Addition rule for mutually exclusive events: $P(A \cup B) = P(A) + P(B)$
  • Addition rule for non-mutually exclusive events: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
  • Multiplication rule for independent events: $P(A \cap B) = P(A) \cdot P(B)$
  • Multiplication rule for dependent events: $P(A \cap B) = P(A) \cdot P(B|A)$
  • Complementary events have probabilities that sum to 1 (rolling an even number and rolling an odd number on a die)
    • $P(A) + P(A^c) = 1$, where $A^c$ is the complement of event A
  • Bayes' theorem relates conditional probabilities: $P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$
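Bayes' theorem is easiest to see with numbers. This sketch uses hypothetical screening-test figures (the 1% prior, 95% sensitivity, and 5% false-positive rate are made up for illustration), with the law of total probability supplying the denominator:

```python
# A = "condition present", B = "test positive" (hypothetical numbers)
p_a = 0.01              # prior P(A)
p_b_given_a = 0.95      # P(B|A): test is positive given the condition
p_b_given_not_a = 0.05  # P(B|A^c): false-positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # 0.161: small despite the accurate test
```

The posterior stays low because the prior is low, a classic base-rate effect that the raw conditional probabilities hide.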

Random Variables

  • Random variable (X) assigns a numerical value to each outcome in a sample space
  • Discrete random variables have countable outcomes (number of heads in 5 coin flips)
  • Continuous random variables have uncountable outcomes within an interval (time until a stock reaches a certain price)
  • Probability mass function (PMF) gives the probability of each value for a discrete random variable
    • $p(x) = P(X = x)$, where x is a possible value of X
  • Probability density function (PDF) describes the probability distribution for a continuous random variable
    • $f(x)$ is the PDF, and $P(a \leq X \leq b) = \int_a^b f(x)\,dx$
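A discrete random variable and its PMF can be built by brute-force enumeration of the sample space. A minimal sketch for X = number of heads in 3 fair coin flips:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# Sample space: all 8 equally likely sequences of 3 coin flips
outcomes = list(product("HT", repeat=3))

# X maps each outcome to a number: here, the count of heads
counts = Counter(o.count("H") for o in outcomes)
pmf = {x: Fraction(c, len(outcomes)) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(f"P(X={x}) = {p}")      # 1/8, 3/8, 3/8, 1/8
assert sum(pmf.values()) == 1     # a valid PMF sums to 1
```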

Probability Distributions

  • Bernoulli distribution models a single trial with two possible outcomes (success or failure)
    • $P(X = 1) = p$ and $P(X = 0) = 1 - p$, where p is the probability of success
  • Binomial distribution models the number of successes in a fixed number of independent Bernoulli trials
    • $P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$, where n is the number of trials and k is the number of successes
  • Poisson distribution models the number of events occurring in a fixed interval of time or space
    • $P(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}$, where $\lambda$ is the average rate of events
  • Normal (Gaussian) distribution is a continuous distribution with a bell-shaped curve
    • Characterized by its mean ($\mu$) and standard deviation ($\sigma$)
    • Standard normal distribution has $\mu = 0$ and $\sigma = 1$
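The formulas above translate directly into code. This is a standard-library sketch of the binomial and Poisson PMFs and the normal PDF (in practice a library such as `scipy.stats` would be used instead):

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for Binomial(n, p): C(n, k) p^k (1-p)^(n-k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for Poisson(lambda): e^(-lambda) lambda^k / k!."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# Probability of exactly 2 heads in 5 fair coin flips: C(5,2) / 2^5
print(binomial_pmf(2, 5, 0.5))  # 0.3125
```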

Expectation and Variance

  • Expected value (mean) of a random variable is the average value over many trials
    • For discrete X: $E(X) = \sum_{x} x \cdot p(x)$
    • For continuous X: $E(X) = \int_{-\infty}^{\infty} x \cdot f(x)\,dx$
  • Variance measures the average squared deviation from the mean
    • $\mathrm{Var}(X) = E[(X - \mu)^2] = E(X^2) - [E(X)]^2$
  • Standard deviation is the square root of the variance: $\sigma = \sqrt{\mathrm{Var}(X)}$
  • Linearity of expectation: $E(aX + bY) = aE(X) + bE(Y)$ for constants a and b
  • Covariance measures the linear relationship between two random variables X and Y
    • $\mathrm{Cov}(X, Y) = E[(X - E(X))(Y - E(Y))]$
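These moments can be computed by weighting each outcome by its probability. A sketch with a hypothetical joint distribution of two discrete returns X and Y (the three outcomes and their probabilities are made up for illustration):

```python
# Hypothetical joint PMF: {(x, y): probability}
joint = {(-0.05, -0.02): 0.2, (0.00, 0.01): 0.3, (0.10, 0.04): 0.5}

# Expected values: E(X) = sum x * p(x), and similarly for Y and X^2
E_X = sum(p * x for (x, y), p in joint.items())
E_Y = sum(p * y for (x, y), p in joint.items())
E_X2 = sum(p * x * x for (x, y), p in joint.items())

# Variance via the shortcut formula: Var(X) = E(X^2) - [E(X)]^2
var_X = E_X2 - E_X ** 2

# Covariance from the definition: E[(X - E(X))(Y - E(Y))]
cov_XY = sum(p * (x - E_X) * (y - E_Y) for (x, y), p in joint.items())

print(round(E_X, 4), round(var_X, 6), round(cov_XY, 6))
```

The positive covariance here reflects that large values of X coincide with large values of Y in this toy distribution.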

Applications in Finance

  • Portfolio return is a weighted average of individual asset returns
    • $R_p = \sum_{i=1}^n w_i R_i$, where $w_i$ is the weight of asset i and $R_i$ is its return
  • Portfolio variance depends on asset variances and covariances
    • $\sigma_p^2 = \sum_{i=1}^n w_i^2 \sigma_i^2 + 2 \sum_{i=1}^n \sum_{j=i+1}^n w_i w_j \sigma_i \sigma_j \rho_{ij}$
    • $\rho_{ij}$ is the correlation coefficient between assets i and j
  • Value at Risk (VaR) estimates the potential loss for an investment over a given time horizon and confidence level
    • For a normal distribution: $\mathrm{VaR} = \mu + z_{\alpha} \sigma$, where $z_{\alpha}$ is the z-score for the desired confidence level (negative for a lower-tail loss, e.g. $z \approx -1.645$ at 95%)
  • Option pricing models (Black-Scholes) use probability distributions to estimate the fair price of options contracts
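The portfolio formulas and normal VaR can be worked through for a two-asset example. All inputs here (weights, expected returns, volatilities, correlation) are hypothetical:

```python
import math

# Hypothetical two-asset portfolio
w1, w2 = 0.6, 0.4
mu1, mu2 = 0.08, 0.12   # expected returns
s1, s2 = 0.15, 0.25     # standard deviations (volatilities)
rho = 0.3               # correlation between the two assets

# Expected portfolio return: R_p = w1*R1 + w2*R2
mu_p = w1 * mu1 + w2 * mu2

# Portfolio variance: w1^2 s1^2 + w2^2 s2^2 + 2 w1 w2 s1 s2 rho
var_p = w1**2 * s1**2 + w2**2 * s2**2 + 2 * w1 * w2 * s1 * s2 * rho
sigma_p = math.sqrt(var_p)

# 95% VaR under normality: the 5th-percentile return mu_p + z*sigma_p
# (z ≈ -1.645), reported as a positive loss
z_alpha = -1.645
var_95 = -(mu_p + z_alpha * sigma_p)
print(round(mu_p, 3), round(sigma_p, 4), round(var_95, 4))
```

Note that the portfolio volatility is below the weighted average of the individual volatilities; this is the diversification effect of a correlation less than 1.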

Common Probability Problems

  • Gambler's Ruin problem analyzes the probability of a gambler going bankrupt
    • Depends on initial capital, win probability, and bet size
  • Birthday Problem calculates the probability of two people sharing a birthday in a group
    • Surprisingly high for relatively small groups (50% for 23 people)
  • Monty Hall problem demonstrates the importance of conditional probability
    • Switching doors after the host reveals a goat increases the win probability to 2/3
  • Coupon Collector's problem determines the expected number of trials to collect all unique items
    • $E(X) = n\left(\frac{1}{n} + \frac{1}{n-1} + \cdots + \frac{1}{2} + 1\right)$, where n is the number of unique items
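Two of these classics can be computed exactly with a few lines, following the product rule for the birthday problem and the harmonic-sum formula above for the coupon collector:

```python
from fractions import Fraction

def birthday_collision(n, days=365):
    """P(at least two of n people share a birthday), assuming uniform birthdays.
    Computed via the complement: 1 - P(all n birthdays are distinct)."""
    p_distinct = Fraction(1)
    for i in range(n):
        p_distinct *= Fraction(days - i, days)
    return 1 - p_distinct

def coupon_collector_expected(n):
    """Expected draws to collect all n unique items: n * (1/n + ... + 1/2 + 1)."""
    return n * sum(Fraction(1, k) for k in range(1, n + 1))

print(round(float(birthday_collision(23)), 3))   # 0.507: just past 50% at 23 people
print(float(coupon_collector_expected(6)))       # 14.7 rolls to see all 6 die faces
```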

Key Formulas and Techniques

  • Combinatorics for counting possibilities
    • Permutations: $_nP_r = \frac{n!}{(n-r)!}$
    • Combinations: $_nC_r = \binom{n}{r} = \frac{n!}{r!(n-r)!}$
  • Moment generating functions (MGFs) uniquely characterize probability distributions
    • $M_X(t) = E(e^{tX}) = \sum_{x} e^{tx} \cdot p(x)$ for discrete X
    • $M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} \cdot f(x)\,dx$ for continuous X
  • Law of Large Numbers states that the sample mean approaches the population mean as the sample size increases
    • $\lim_{n \to \infty} P(|\bar{X}_n - \mu| < \epsilon) = 1$ for any $\epsilon > 0$
  • Central Limit Theorem asserts that the standardized sum of many independent, identically distributed random variables with finite variance is approximately normally distributed
    • Useful for approximating probabilities for large samples
  • Markov Chains model systems transitioning between states with fixed probabilities
    • Transition matrix P contains the probabilities of moving from one state to another
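A Markov chain can be simulated from its transition matrix, and by the Law of Large Numbers the long-run fraction of time in each state converges to the stationary distribution. A sketch with a hypothetical two-state "bull/bear market" chain (the transition probabilities are made up for illustration):

```python
import random

# Transition matrix for states 0 = "bull market", 1 = "bear market"
# P[i][j] = probability of moving from state i to state j
P = [[0.9, 0.1],   # from bull: stay with 0.9, switch with 0.1
     [0.3, 0.7]]   # from bear: switch with 0.3, stay with 0.7

def step(state, rng):
    """Take one transition from the current state."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(42)
state, bull_count, n = 0, 0, 100_000
for _ in range(n):
    state = step(state, rng)
    bull_count += (state == 0)

# The stationary distribution solves pi = pi P; for this chain
# pi_bull = 0.3 / (0.1 + 0.3) = 0.75, and the empirical fraction approaches it
print(round(bull_count / n, 2))
```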


© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.