Stochastic Processes Unit 3 – Stochastic Processes Basics

Stochastic processes are collections of random variables indexed by time or space. They model unpredictable systems in fields like finance, physics, and biology. Key concepts include state spaces, sample paths, stationarity, and the Markov property. Types of stochastic processes include discrete-time and continuous-time processes, Markov chains, Gaussian processes, and point processes. Understanding probability theory foundations is crucial for analyzing these processes and solving related problems in various applications.

Key Concepts and Definitions

  • A stochastic process $\{X(t), t \in T\}$ is a collection of random variables indexed by a parameter $t$, often representing time
  • The state space $S$ is the set of all possible values that the random variables $X(t)$ can take
    • Can be discrete (finite or countably infinite) or continuous (uncountably infinite)
  • A sample path or realization is a single possible trajectory of the stochastic process over time
  • Stationarity means that the joint probability distribution of the process does not change when shifted in time
    • Strictly stationary if the joint distribution is invariant under any time shift
    • Weakly stationary (or covariance stationary) if the mean is constant over time and the covariance depends only on the time lag
  • Ergodicity means that the time average of a single realization converges to the ensemble average as the time interval grows
  • The Markov property states that the future state of the process depends only on the current state, not on the past states
    • $P(X(t_n) \leq x_n \mid X(t_{n-1}) = x_{n-1}, \ldots, X(t_1) = x_1) = P(X(t_n) \leq x_n \mid X(t_{n-1}) = x_{n-1})$
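The sample-path and Markov-property ideas above can be illustrated with a simple symmetric random walk. This is a minimal pure-Python sketch; the step rule, number of steps, and seed are illustrative choices, not part of the definitions above.

```python
import random

random.seed(0)  # fixed seed so the realization is reproducible

# One sample path (realization) of a simple symmetric random walk:
# X_0 = 0 and X_{n+1} = X_n + 1 or X_n - 1, each with probability 1/2.
# The next step depends only on the current state, so the walk is Markov.
def random_walk_path(n_steps):
    path = [0]
    for _ in range(n_steps):
        path.append(path[-1] + random.choice([-1, 1]))
    return path

path = random_walk_path(10)
print(path)  # one realization of the process on T = {0, 1, ..., 10}
```

Running the function again (with a different seed) produces a different sample path; the process is the whole family of such paths with their probabilities, not any single one.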

Types of Stochastic Processes

  • Discrete-time processes have a countable index set $T$, often representing equally spaced time points (e.g., $T = \{0, 1, 2, \ldots\}$)
    • Examples include random walks and discrete-time Markov chains
  • Continuous-time processes have an uncountable index set $T$, typically a continuous time interval (e.g., $T = [0, \infty)$)
    • Examples include Poisson processes and Brownian motion
  • Markov processes satisfy the Markov property, where the future state depends only on the current state
    • Can be discrete-time (Markov chains) or continuous-time (continuous-time Markov processes)
  • Gaussian processes have finite-dimensional joint distributions that are multivariate normal
    • Characterized by a mean function and a covariance function
  • Renewal processes count events whose inter-arrival times are a sequence of independent and identically distributed (i.i.d.) random variables
  • Point processes describe the occurrence of events in time or space, the most notable example being the Poisson process

Probability Theory Foundations

  • A probability space $(\Omega, \mathcal{F}, P)$ consists of a sample space $\Omega$, a $\sigma$-algebra $\mathcal{F}$, and a probability measure $P$
  • A random variable $X$ is a measurable function from the sample space $\Omega$ to the real numbers $\mathbb{R}$
  • The cumulative distribution function (CDF) $F_X(x) = P(X \leq x)$ fully characterizes the distribution of a random variable $X$
  • Probability mass function (PMF) for discrete random variables: $p_X(x) = P(X = x)$
  • Probability density function (PDF) for continuous random variables: $f_X(x) = \frac{dF_X(x)}{dx}$
  • Expected value $E[X] = \sum_{x} x\, p_X(x)$ for discrete $X$ and $E[X] = \int_{-\infty}^{\infty} x f_X(x)\, dx$ for continuous $X$
  • Variance $\text{Var}(X) = E[(X - E[X])^2]$ measures the dispersion of a random variable around its mean
  • Covariance $\text{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])]$ quantifies the linear dependence between two random variables $X$ and $Y$
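The expectation and variance formulas above can be checked numerically for any discrete PMF. As a small worked example, take a fair six-sided die (an illustrative choice):

```python
# PMF of a fair six-sided die: p(x) = 1/6 for x in {1, ..., 6}
pmf = {x: 1 / 6 for x in range(1, 7)}

# E[X] = sum over x of x * p(x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - E[X])^2] = sum over x of (x - E[X])^2 * p(x)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(mean)  # 3.5
print(var)   # 35/12 ≈ 2.9167
```

The same two sums work for any finite PMF; for continuous random variables the sums become the integrals given above.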

Markov Chains

  • A Markov chain is a discrete-time stochastic process satisfying the Markov property
    • Its state space $S$ can be finite or countably infinite
  • The transition probability $p_{ij} = P(X_{n+1} = j \mid X_n = i)$ is the probability of moving from state $i$ to state $j$ in one step
  • The transition probability matrix $P = [p_{ij}]$ contains all one-step transition probabilities
    • Each row of $P$ sums to 1, as it represents a probability distribution
  • The $n$-step transition probability $p_{ij}^{(n)} = P(X_{m+n} = j \mid X_m = i)$ gives the probability of moving from state $i$ to state $j$ in $n$ steps
  • The Chapman-Kolmogorov equations relate multi-step transition probabilities to shorter ones: $p_{ij}^{(m+n)} = \sum_{k \in S} p_{ik}^{(m)} p_{kj}^{(n)}$
  • A stationary distribution $\pi$ satisfies $\pi P = \pi$ and represents the long-run proportion of time spent in each state
    • For irreducible, aperiodic, positive recurrent Markov chains, the stationary distribution is unique and the chain converges to it regardless of the initial state
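A minimal sketch of these ideas, using a hypothetical two-state chain (state 0 = "sunny", state 1 = "rainy"; the transition probabilities are made up for illustration). Repeated matrix multiplication gives $P^n$, which is exactly the Chapman-Kolmogorov recursion, and the rows of $P^n$ converge to the stationary distribution:

```python
# One-step transition matrix of a hypothetical two-state chain.
# Each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def n_step(P, n):
    """P^n via Chapman-Kolmogorov: p^(m+n) = p^(m) p^(n)."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

P50 = n_step(P, 50)
# Both rows of P^50 are (numerically) the stationary distribution pi.
# Here pi P = pi gives 0.1*pi_0 = 0.5*pi_1, so pi = (5/6, 1/6).
print(P50[0])
print(P50[1])
```

Note that the starting row no longer matters at $n = 50$: both rows agree, which is the "regardless of the initial state" statement above for this irreducible, aperiodic chain.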

Poisson Processes

  • A Poisson process $\{N(t), t \geq 0\}$ is a continuous-time counting process satisfying certain properties
    • $N(t)$ represents the number of events that have occurred up to time $t$
  • Increments $N(t) - N(s)$ for $t > s$ are independent and Poisson distributed with mean $\lambda(t - s)$, where $\lambda > 0$ is the rate parameter
  • Inter-arrival times between consecutive events are independent and exponentially distributed with mean $1/\lambda$
  • Poisson processes are memoryless: the waiting time until the next event does not depend on the time since the last event
  • Poisson processes have stationary and independent increments
    • Stationary increments: the distribution of $N(t) - N(s)$ depends only on the time difference $t - s$
    • Independent increments: increments over disjoint time intervals are independent random variables
  • Poisson processes are often used to model the occurrence of rare events, such as customer arrivals or machine failures
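The inter-arrival characterization above gives a direct simulation recipe: accumulate i.i.d. Exponential($\lambda$) gaps until the horizon is exceeded. The rate, horizon, and seed below are illustrative assumptions:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def poisson_arrivals(lam, horizon):
    """Arrival times of a rate-lam Poisson process on [0, horizon],
    built from i.i.d. Exponential(lam) inter-arrival times (mean 1/lam)."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)  # next exponential gap
        if t > horizon:
            return times
        times.append(t)

arrivals = poisson_arrivals(lam=2.0, horizon=10.0)
# N(10) is Poisson distributed with mean lam * t = 20
print(len(arrivals))
```

By the increment property, the count printed here is one draw from a Poisson distribution with mean 20; rerunning with different seeds samples that distribution.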

Brownian Motion

  • Brownian motion (or the Wiener process) $\{B(t), t \geq 0\}$ is a continuous-time stochastic process with the following properties:
    • $B(0) = 0$ almost surely
    • Increments $B(t) - B(s)$ for $t > s$ are independent and normally distributed with mean 0 and variance $t - s$
  • Sample paths of Brownian motion are continuous but nowhere differentiable
  • Brownian motion has stationary and independent increments
  • Brownian motion is a Gaussian process with mean function $\mu(t) = 0$ and covariance function $\text{Cov}(B(s), B(t)) = \min(s, t)$
  • Variations of Brownian motion include:
    • Brownian motion with drift: $X(t) = \mu t + \sigma B(t)$, where $\mu$ is the drift parameter and $\sigma$ is the volatility parameter
    • Geometric Brownian motion: $S(t) = S(0) \exp(\mu t + \sigma B(t))$, often used to model stock prices
  • Brownian motion is the foundation for many continuous-time stochastic processes and is widely used in financial mathematics and physics
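Because the increments are independent $N(0, \Delta t)$ draws, a Brownian path can be simulated on a time grid by summing Gaussian steps; a geometric Brownian motion path then follows by plugging $B(t)$ into the formula above. The grid size, drift, volatility, and seed are illustrative assumptions:

```python
import math
import random

random.seed(2)  # fixed seed for reproducibility

def brownian_path(n_steps, dt):
    """Standard Brownian motion on the grid t_k = k*dt:
    B(0) = 0, and each increment is an independent N(0, dt) draw."""
    B = [0.0]
    for _ in range(n_steps):
        B.append(B[-1] + random.gauss(0.0, math.sqrt(dt)))
    return B

dt = 0.001
B = brownian_path(1000, dt)  # one path on [0, 1]

# Geometric Brownian motion S(t) = S(0) * exp(mu*t + sigma*B(t)),
# with illustrative parameters S(0) = 100, mu = 0.05, sigma = 0.2
S = [100.0 * math.exp(0.05 * k * dt + 0.2 * b) for k, b in enumerate(B)]

print(B[-1])  # B(1): one draw from N(0, 1)
print(S[-1])  # corresponding GBM value at t = 1
```

Shrinking `dt` refines the grid; the simulated path is only a piecewise-linear approximation, but its values at the grid points have exactly the right joint distribution.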

Applications and Examples

  • Queueing theory: Markov chains and Poisson processes are used to model and analyze queueing systems (e.g., customer service centers, manufacturing systems)
    • Example: M/M/1 queue with Poisson arrivals and exponentially distributed service times
  • Reliability theory: Stochastic processes are used to model the lifetime and failure behavior of systems and components
    • Example: exponential distribution for modeling the time to failure of a light bulb
  • Finance: Stochastic processes, particularly Brownian motion, are used to model asset prices, interest rates, and other financial variables
    • Example: Black-Scholes model for pricing European options using geometric Brownian motion
  • Biology: Stochastic processes are used to model population dynamics, genetic drift, and the spread of epidemics
    • Example: birth-death processes for modeling population growth and extinction
  • Physics: Brownian motion is used to describe the random motion of particles suspended in a fluid
    • Example: diffusion of molecules in a gas or liquid
  • Speech recognition: Hidden Markov models (HMMs) are used to model and recognize speech patterns
    • Example: using HMMs to identify phonemes and words in a spoken sentence
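The M/M/1 example above can be simulated directly: with $n$ customers present, the next event occurs after an exponential time at total rate $\lambda + \mu\,\mathbf{1}\{n > 0\}$, and it is an arrival with probability $\lambda/(\lambda + \mu)$. This is a rough sketch with hypothetical rates; in steady state the mean number in system is $\rho/(1-\rho)$ with $\rho = \lambda/\mu < 1$:

```python
import random

random.seed(3)  # fixed seed for reproducibility

lam, mu = 1.0, 2.0  # illustrative arrival/service rates, rho = 0.5
t, n, n_area, t_end = 0.0, 0, 0.0, 10000.0

while t < t_end:
    # With n customers, events occur at total rate lam (+ mu if n > 0)
    rate = lam + (mu if n > 0 else 0.0)
    dt = random.expovariate(rate)
    n_area += n * dt          # accumulate time-weighted number in system
    t += dt
    if random.random() < lam / rate:
        n += 1                # arrival
    else:
        n -= 1                # departure (only reachable when n > 0)

# Long-run average number in system; theory predicts rho/(1-rho) = 1 here
print(n_area / t)
```

Over a long horizon the time average should settle near the theoretical value 1, illustrating the ergodicity idea from the first section.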

Problem-Solving Techniques

  • Identify the type of stochastic process based on the problem description and the properties of the process
  • Determine the state space (discrete or continuous) and the index set (discrete or continuous time)
  • For Markov chains:
    • Construct the transition probability matrix $P$
    • Use the Chapman-Kolmogorov equations to find $n$-step transition probabilities
    • Solve for the stationary distribution $\pi$ by solving the system of linear equations $\pi P = \pi$ together with $\sum_{i \in S} \pi_i = 1$
  • For Poisson processes:
    • Identify the rate parameter $\lambda$
    • Use the Poisson distribution to find probabilities for the number of events in a given time interval
    • Use the exponential distribution to find probabilities related to inter-arrival times
  • For Brownian motion:
    • Use the properties of the normal distribution to find probabilities related to increments
    • Apply Itô's lemma for transformations of Brownian motion (e.g., geometric Brownian motion)
  • Simulate sample paths using appropriate methods (e.g., summing exponential inter-arrival times for Poisson processes, Gaussian increments for Brownian motion, or the Euler-Maruyama scheme for stochastic differential equations)
  • Use conditioning and the law of total probability to break complex problems into simpler subproblems
  • Apply moment-generating functions or characteristic functions to derive properties of stochastic processes
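As a worked instance of the Poisson-process recipe, the two distributions involved can be evaluated directly from their formulas: $P(N(t) = k) = e^{-\lambda t}(\lambda t)^k / k!$ for counts and $P(T > s) = e^{-\lambda s}$ for inter-arrival times. The rate and time window below are hypothetical:

```python
import math

lam, t = 3.0, 2.0  # illustrative: 3 events per hour, 2-hour window

def p_count(k):
    """P(N(t) = k): Poisson pmf with mean lam * t."""
    m = lam * t
    return math.exp(-m) * m ** k / math.factorial(k)

def p_gap_exceeds(s):
    """P(inter-arrival time > s): Exponential(lam) tail."""
    return math.exp(-lam * s)

print(p_count(6))          # count in [0, 2] has mean lam*t = 6
print(p_gap_exceeds(1.0))  # e^{-3} ≈ 0.0498
```

The same pattern, identify $\lambda$, then plug into the Poisson pmf or the exponential tail, handles most "number of events" and "waiting time" questions about a Poisson process.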


© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.
