Intro to Time Series

ARMA models blend autoregressive and moving average components to capture time series patterns. They assume stationarity and use lagged observations and error terms to predict future values. Understanding ARMA models is key to grasping more complex time series analysis techniques.

Estimating ARMA parameters involves methods like least squares and maximum likelihood. Assessing stationarity and invertibility is crucial for model validity. ARMA models find applications in forecasting and understanding temporal relationships in various fields, from finance to environmental science.

Introduction to Mixed ARMA Models

Characteristics of ARMA models

  • ARMA models combine autoregressive (AR) and moving average (MA) components to capture the autocorrelation structure of stationary time series data parsimoniously
    • AR component models the relationship between an observation and a specified number of lagged observations (past values of the series)
    • MA component represents the error term as a linear combination of error terms occurring at various times in the past (past forecast errors)
  • ARMA models assume the time series is stationary, meaning its statistical properties (mean, variance, autocovariance) remain constant over time
  • The general form of an ARMA(p, q) model is expressed as:
    • $y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + ... + \phi_p y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q}$
      • $y_t$ represents the time series value at time t
      • $c$ denotes a constant term
      • $\phi_1, \phi_2, ..., \phi_p$ are the autoregressive coefficients
      • $\varepsilon_t, \varepsilon_{t-1}, ..., \varepsilon_{t-q}$ represent the white noise error terms
      • $\theta_1, \theta_2, ..., \theta_q$ are the moving average coefficients
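The defining recursion above can be simulated directly. A minimal numpy sketch of a general ARMA(p, q) generator (the parameter values, seed, and burn-in length are illustrative choices, not part of the definition):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_arma(n, c=0.0, phi=(0.5,), theta=(0.3,), sigma=1.0, burn=200):
    """Simulate y_t = c + sum_i phi_i*y_{t-i} + eps_t + sum_j theta_j*eps_{t-j}."""
    p, q = len(phi), len(theta)
    eps = rng.normal(0.0, sigma, n + burn)
    y = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        ar = sum(phi[i] * y[t - 1 - i] for i in range(p))
        ma = sum(theta[j] * eps[t - 1 - j] for j in range(q))
        y[t] = c + ar + eps[t] + ma
    return y[burn:]  # drop the burn-in so startup transients do not distort the sample

series = simulate_arma(500, phi=(0.5,), theta=(0.3,))
```

Discarding a burn-in period is important: the recursion starts from zeros, so the first observations are not yet draws from the stationary distribution.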

Orders in ARMA models

  • The order of an ARMA model, denoted as ARMA(p, q), specifies the number of lagged terms and error terms included in the model equation
    • p represents the order of the autoregressive component (number of lagged observations)
    • q represents the order of the moving average component (number of lagged error terms)
  • For instance, an ARMA(1, 1) model has the form:
    • $y_t = c + \phi_1 y_{t-1} + \varepsilon_t + \theta_1 \varepsilon_{t-1}$
    • This model includes one lagged observation ($y_{t-1}$) and one lagged error term ($\varepsilon_{t-1}$)
  • The appropriate orders (p and q) are determined by examining the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the time series
    • ACF measures the correlation between observations at different lags (time intervals); for a pure MA(q) process it cuts off after lag q
    • PACF measures the correlation between observations at different lags while controlling for the effects of intermediate lags; for a pure AR(p) process it cuts off after lag p
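Both functions can be computed from scratch to see how the order-selection logic works; libraries such as statsmodels provide production versions. In this numpy-only sketch the PACF at lag k is obtained as the last coefficient of an AR(k) fit via the Yule-Walker equations (the AR(1) test series and its coefficient 0.7 are illustrative):

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelations r_0..r_nlags (r_0 = 1 by definition)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom for k in range(1, nlags + 1)])

def sample_pacf(x, nlags):
    """Sample PACF: the lag-k value is the last coefficient of a fitted AR(k) model,
    found by solving the Yule-Walker equations R*phi = r."""
    r = sample_acf(x, nlags)
    out = [1.0]
    for k in range(1, nlags + 1):
        R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
        out.append(np.linalg.solve(R, r[1:k + 1])[-1])
    return np.array(out)

# For an AR(1) series the ACF decays geometrically while the PACF cuts off after lag 1.
rng = np.random.default_rng(1)
y = np.zeros(3000)
for t in range(1, 3000):
    y[t] = 0.7 * y[t - 1] + rng.normal()
```

Running `sample_pacf(y, 3)` on this series gives a large lag-1 value near 0.7 and values near zero afterwards, which is the pattern that points to p = 1.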

Estimation and Assessment of ARMA Models

Parameter estimation for ARMA

  • Least squares estimation finds the parameter values that minimize the sum of squared residuals (differences between observed and predicted values)
  • Maximum likelihood estimation finds the parameter values that maximize the likelihood function (the probability of observing the given data under the assumed model)
  • Both methods estimate the autoregressive coefficients ($\phi_1, \phi_2, ..., \phi_p$), moving average coefficients ($\theta_1, \theta_2, ..., \theta_q$), and the constant term ($c$) to fit the ARMA model to the data
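For a pure AR model (the q = 0 special case), conditional least squares reduces to an ordinary linear regression of $y_t$ on its own lags; MA terms make the objective nonlinear in the parameters and require iterative optimization. A sketch of the linear case, with made-up true coefficients:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(2) series with known (illustrative) coefficients.
n, phi1, phi2 = 2000, 0.6, -0.2
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal()

# Conditional least squares: regress y_t on (1, y_{t-1}, y_{t-2}).
X = np.column_stack([np.ones(n - 2), y[1:-1], y[:-2]])
c_hat, phi1_hat, phi2_hat = np.linalg.lstsq(X, y[2:], rcond=None)[0]
```

With 2000 observations the estimates land close to the true values (0.6 and -0.2), illustrating that least squares recovers the generating parameters when the model is correctly specified.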

Stationarity and invertibility assessment

  • Stationarity is a fundamental assumption for ARMA models, requiring constant mean, variance, and autocovariance over time
    • For an ARMA model to be stationary, the roots of the characteristic equation of the AR component must lie outside the unit circle
      • Characteristic equation: $1 - \phi_1 z - \phi_2 z^2 - ... - \phi_p z^p = 0$
      • If any root has an absolute value ≤ 1, the model is non-stationary
  • Invertibility ensures the uniqueness of the MA representation
    • An ARMA model is invertible if the roots of the characteristic equation of the MA component lie outside the unit circle
      • Characteristic equation: $1 + \theta_1 z + \theta_2 z^2 + ... + \theta_q z^q = 0$
      • If any root has an absolute value ≤ 1, the model is non-invertible
  • Non-stationary or non-invertible models may produce unreliable forecasts and inference
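Both root conditions are easy to check numerically. A numpy sketch that builds each characteristic polynomial from the fitted coefficients and tests whether every root lies outside the unit circle (the example coefficient values are illustrative):

```python
import numpy as np

def ar_roots(phi):
    """Roots of 1 - phi_1 z - ... - phi_p z^p."""
    # np.roots expects coefficients ordered from the highest power down to the constant.
    return np.roots(np.r_[-np.asarray(phi)[::-1], 1.0])

def ma_roots(theta):
    """Roots of 1 + theta_1 z + ... + theta_q z^q."""
    return np.roots(np.r_[np.asarray(theta)[::-1], 1.0])

def is_stationary(phi):
    """Stationary iff every AR root has modulus strictly greater than 1."""
    return bool(np.all(np.abs(ar_roots(phi)) > 1.0))

def is_invertible(theta):
    """Invertible iff every MA root has modulus strictly greater than 1."""
    return bool(np.all(np.abs(ma_roots(theta)) > 1.0))

print(is_stationary([0.5]))   # root z = 2 lies outside the unit circle -> True
print(is_stationary([1.2]))   # root z = 1/1.2 lies inside the unit circle -> False
```

For AR(1) this reduces to the familiar condition $|\phi_1| < 1$, since the single root is $z = 1/\phi_1$.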

Applying ARMA models in practice

  1. Identify the appropriate orders (p and q) for the ARMA model based on the ACF and PACF of the time series

  2. Estimate the model parameters using least squares or maximum likelihood estimation

  3. Assess the model's goodness of fit using diagnostic tools (residual analysis, information criteria like AIC and BIC)

    • Residual analysis checks if residuals are uncorrelated and normally distributed
    • Information criteria balance model fit and complexity, preferring simpler models
  4. Interpret the estimated coefficients in the context of the problem domain

    • Autoregressive coefficients indicate the persistence of past observations on the current value
    • Moving average coefficients represent the impact of past shocks on the current value
  5. Use the fitted ARMA model to generate forecasts and evaluate their accuracy by comparing with actual values

  6. Consider the limitations and assumptions of ARMA models (linearity, stationarity) when interpreting the results

  7. Apply insights from the ARMA model to inform decision-making and problem-solving in the specific domain
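The estimation, assessment, and forecasting steps above can be sketched end to end. This toy example fits an ARMA(1, 1) by conditional sum of squares using a crude grid search (real software relies on maximum likelihood with numerical optimizers), then computes a Gaussian AIC and a one-step-ahead forecast; the true parameters, seed, and grid are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an ARMA(1,1) series with known parameters (phi = 0.5, theta = 0.3).
n, phi_true, theta_true = 800, 0.5, 0.3
eps = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + eps[t] + theta_true * eps[t - 1]

def css(phi, theta):
    """Conditional sum of squares: reconstruct residuals recursively (taking e_0 = 0)."""
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = y[t] - phi * y[t - 1] - theta * e[t - 1]
    return np.sum(e[1:] ** 2)

# Estimate (phi, theta) by grid search over the stationary/invertible region (-0.9, 0.9).
grid = np.arange(-0.9, 0.95, 0.1)
phi_hat, theta_hat = min(((p, q) for p in grid for q in grid), key=lambda pq: css(*pq))

# Gaussian AIC from the residual variance (k = 2 estimated coefficients).
aic = n * np.log(css(phi_hat, theta_hat) / n) + 2 * 2

# One-step-ahead forecast: y_{n+1|n} = phi_hat * y_n + theta_hat * e_n.
e = np.zeros(n)
for t in range(1, n):
    e[t] = y[t] - phi_hat * y[t - 1] - theta_hat * e[t - 1]
forecast = phi_hat * y[-1] + theta_hat * e[-1]
```

The AIC computed here would be compared across candidate orders (p, q), with the lowest value indicating the preferred balance of fit and complexity.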