This note covers general linear processes, moving average (MA) processes, autoregressive (AR) processes, and mixed ARMA models, focusing on their statistical properties, stationarity, and invertibility.


General stochastic linear processes

Let

  • $t$: Time index
  • $Y_t$: Process value at time $t$
  • $e_t$: White noise error term at time $t$, i.i.d. with mean $0$ and constant variance $\sigma_e^2$

A general linear process is:

$$Y_t = e_t + \psi_1 e_{t-1} + \psi_2 e_{t-2} + \cdots = \sum_{j=0}^{\infty} \psi_j e_{t-j}, \qquad \psi_0 = 1$$

Assuming $\sum_{j=0}^{\infty} \psi_j^2 < \infty$, the mean is $E(Y_t) = 0$, and the autocovariance is:

$$\gamma_k = \operatorname{Cov}(Y_t, Y_{t-k}) = \sigma_e^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+k}$$

It is convergent if $\sum_{j=0}^{\infty} \psi_j^2 < \infty$.

  • Example:

    Suppose $\psi_j = \phi^j$ and $|\phi| < 1$.

    Then,

    $$Y_t = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \cdots$$

    so $\operatorname{Var}(Y_t) = \dfrac{\sigma_e^2}{1 - \phi^2}$, $\gamma_k = \dfrac{\phi^k \sigma_e^2}{1 - \phi^2}$, and $\rho_k = \phi^k$.

    This process is stationary, with autocovariance depending only on the lag (a quick numerical check follows below).
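
The following is a minimal numpy sketch of this example, approximating the infinite sum by truncating it at a large $J$; the values of $\phi$, $\sigma_e$, and the seed are illustrative assumptions, not from the note.

```python
import numpy as np

# Sketch: approximate Y_t = sum_j phi^j e_{t-j} by truncating the sum at J.
# phi, sigma_e, and the seed are illustrative assumptions.
rng = np.random.default_rng(0)
phi, sigma_e, n_obs, J = 0.7, 1.0, 50_000, 200

e = rng.normal(0.0, sigma_e, size=n_obs + J)
psi = phi ** np.arange(J + 1)          # psi_j = phi^j, with psi_0 = 1
Y = np.convolve(e, psi, mode="valid")  # Y[k] = sum_j psi_j * e[J + k - j]

print(Y.var(), sigma_e**2 / (1 - phi**2))      # sample vs theoretical variance
for k in (1, 2, 3):
    r = np.corrcoef(Y[:-k], Y[k:])[0, 1]
    print(k, round(r, 3), round(phi**k, 3))    # sample rho_k vs phi^k
```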

Moving average (MA) processes

An MA($q$) process has finitely many nonzero $\psi$-weights:

$$Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2} - \cdots - \theta_q e_{t-q}$$

where:

  • $q$: Order of the model; the number of lagged error terms
  • $j$: Time lag
  • $\theta_j$: Model parameter for lag $j$
  • $e_{t-j}$: Lagged error; the error at time $t - j$

These models are called short memory models, since the effect of an error does not last long into the future. To illustrate:

![](assets/Pasted image 20250602022612.png)

This goes back to the idea of stationarity, where the dependence on previous observations "declines" over time; in the case of MA models, it actually disappears completely beyond lag $q$.
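
A short simulation sketch makes this cutoff concrete (the coefficients $\theta_1 = 0.6$, $\theta_2 = -0.3$ and the seed are illustrative assumptions): the sample ACF of an MA(2) series is essentially zero at every lag beyond 2.

```python
import numpy as np

# Sketch: the sample ACF of an MA(2) series is ~0 at every lag beyond 2.
rng = np.random.default_rng(1)
th1, th2, n = 0.6, -0.3, 100_000

e = rng.normal(size=n + 2)
Y = e[2:] - th1 * e[1:-1] - th2 * e[:-2]  # Y_t = e_t - th1*e_{t-1} - th2*e_{t-2}

for k in range(1, 6):
    r = np.corrcoef(Y[:-k], Y[k:])[0, 1]
    print(f"lag {k}: {r:+.3f}")           # lags 3, 4, 5 should be near zero
```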

MA(1) Process

This is a model that depends on only one lagged error from the past:

$$Y_t = e_t - \theta e_{t-1}$$

| Property | Expression |
| --- | --- |
| Mean | $E(Y_t) = 0$ |
| Variance | $\gamma_0 = (1 + \theta^2)\,\sigma_e^2$ |
| Covariance | $\gamma_1 = -\theta \sigma_e^2$ |
| Autocorrelation | $\rho_1 = \dfrac{-\theta}{1 + \theta^2}$, and $\rho_k = 0$ for $k \geq 2$ |

$\rho_1$ ranges from $-0.5$ to $0.5$, attained at $\theta = 1$ and $\theta = -1$ respectively.

Simulations show that positive $\theta$ yields a jagged series, while negative $\theta$ yields a smoother one.
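
A quick sketch confirms the range of $\rho_1$ by scanning a grid of $\theta$ values (the grid itself is an arbitrary choice):

```python
import numpy as np

# Sketch: scan theta to see the attainable range of rho_1 = -theta/(1 + theta^2).
thetas = np.linspace(-3.0, 3.0, 601)
rho1 = -thetas / (1 + thetas**2)
print(rho1.min(), thetas[rho1.argmin()])   # -0.5, attained at theta = +1
print(rho1.max(), thetas[rho1.argmax()])   # +0.5, attained at theta = -1
```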

MA(2) Process

| Property | Expression |
| --- | --- |
| Variance | $\gamma_0 = (1 + \theta_1^2 + \theta_2^2)\,\sigma_e^2$ |
| Covariance | $\gamma_1 = (-\theta_1 + \theta_1 \theta_2)\,\sigma_e^2$, $\quad \gamma_2 = -\theta_2 \sigma_e^2$ |
| Autocorrelation | $\rho_1 = \dfrac{-\theta_1 + \theta_1 \theta_2}{1 + \theta_1^2 + \theta_2^2}$, $\quad \rho_2 = \dfrac{-\theta_2}{1 + \theta_1^2 + \theta_2^2}$, $\quad \rho_k = 0$ for $k \geq 3$ |
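
Plugging the same illustrative values as in the simulation sketch above ($\theta_1 = 0.6$, $\theta_2 = -0.3$, $\sigma_e^2 = 1$) into these formulas gives the autocorrelations the simulation approximates:

```python
# Sketch: evaluate the MA(2) formulas for the illustrative values above.
th1, th2, sigma2 = 0.6, -0.3, 1.0

gamma0 = (1 + th1**2 + th2**2) * sigma2        # 1.45
gamma1 = (-th1 + th1 * th2) * sigma2           # -0.78
gamma2 = -th2 * sigma2                         # 0.30
print(gamma1 / gamma0, gamma2 / gamma0)        # rho_1 ~ -0.538, rho_2 ~ 0.207
```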

General MA($q$)

| Property | Expression |
| --- | --- |
| Variance | $\gamma_0 = (1 + \theta_1^2 + \theta_2^2 + \cdots + \theta_q^2)\,\sigma_e^2$ |
| Covariance | $\gamma_k = (-\theta_k + \theta_1 \theta_{k+1} + \theta_2 \theta_{k+2} + \cdots + \theta_{q-k} \theta_q)\,\sigma_e^2$ for $k = 1, \ldots, q$ |
| Autocorrelation | $\rho_k = \dfrac{-\theta_k + \theta_1 \theta_{k+1} + \cdots + \theta_{q-k} \theta_q}{1 + \theta_1^2 + \cdots + \theta_q^2}$ for $k = 1, \ldots, q$, and $\rho_k = 0$ for $k > q$ |
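
This formula is easy to wrap in a small helper. The function below is a sketch (the name ma_acf is my own) and reproduces the MA(1) and MA(2) results above as special cases:

```python
import numpy as np

def ma_acf(thetas, max_lag=10):
    """Theoretical ACF of Y_t = e_t - theta_1 e_{t-1} - ... - theta_q e_{t-q}.

    Sketch of the formula above: rho_k is a lagged dot product of the
    psi-weights (psi_0 = 1, psi_j = -theta_j), divided by their squared norm.
    """
    psi = np.concatenate(([1.0], -np.asarray(thetas, dtype=float)))
    q = len(psi) - 1
    rho = np.zeros(max_lag + 1)
    rho[0] = 1.0
    for k in range(1, min(q, max_lag) + 1):
        rho[k] = (psi[:-k] @ psi[k:]) / (psi @ psi)
    return rho                                 # rho_k stays 0 for k > q

print(ma_acf([0.6, -0.3], max_lag=4))  # matches the MA(2) values above
```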

![](assets/Pasted image 20250414091346.png)


Autoregressive (AR) processes

An AR($p$) process satisfies:

$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + e_t$$

where:

  • $p$: Order of the model; the number of lagged observations in the recursion
  • $\phi_j$: Model parameter for lag $j$
  • $e_t$: Error at time $t$

In contrast to the moving average model, in an AR model each observation depends recursively on all previous observations.

AR(1) Process

For $Y_t = \phi Y_{t-1} + e_t$:

| Property | Expression |
| --- | --- |
| Variance | $\gamma_0 = \dfrac{\sigma_e^2}{1 - \phi^2}$ |
| Autocovariance | $\gamma_k = \phi^k \dfrac{\sigma_e^2}{1 - \phi^2}$ |
| Autocorrelation | $\rho_k = \phi^k$ |

  • $|\phi| < 1$ is the stationarity condition.
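
A simulation sketch along the same lines as before ($\phi = 0.8$ and the seed are illustrative assumptions) shows the geometric decay $\rho_k = \phi^k$, in contrast to the MA cutoff:

```python
import numpy as np

# Sketch: simulate an AR(1) and compare with gamma_0 and rho_k = phi^k.
rng = np.random.default_rng(2)
phi, n, burn = 0.8, 100_000, 500

e = rng.normal(size=n + burn)
Y = np.zeros(n + burn)
for t in range(1, n + burn):
    Y[t] = phi * Y[t - 1] + e[t]
Y = Y[burn:]                          # discard burn-in from the Y_0 = 0 start

print(Y.var(), 1 / (1 - phi**2))      # sample vs theoretical gamma_0
for k in (1, 2, 5):
    r = np.corrcoef(Y[:-k], Y[k:])[0, 1]
    print(k, round(r, 3), round(phi**k, 3))
```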

AR(2) Process

  • Stationarity requires the roots of $1 - \phi_1 x - \phi_2 x^2 = 0$ to exceed 1 in modulus, satisfied by:

    $$\phi_1 + \phi_2 < 1, \qquad \phi_2 - \phi_1 < 1, \qquad |\phi_2| < 1$$

  • Autocorrelation follows the Yule-Walker equation:

    $$\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2}, \qquad k \geq 1$$

  • Initial values: $\rho_0 = 1$ and $\rho_1 = \dfrac{\phi_1}{1 - \phi_2}$ (used in the recursion sketch below)
  • Variance: $\gamma_0 = \left(\dfrac{1 - \phi_2}{1 + \phi_2}\right) \dfrac{\sigma_e^2}{(1 - \phi_2)^2 - \phi_1^2}$
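
Here is a minimal sketch of that recursion ($\phi_1 = 0.5$, $\phi_2 = 0.3$ are illustrative values satisfying the stationarity inequalities):

```python
# Sketch: Yule-Walker recursion for an AR(2), seeded with the initial values.
phi1, phi2 = 0.5, 0.3

rho = [1.0, phi1 / (1 - phi2)]        # rho_0 and rho_1
for k in range(2, 11):
    rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])
print([round(r, 3) for r in rho])     # geometric-like decay, no cutoff
```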

General AR($p$)

Autocorrelation: $\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2} + \cdots + \phi_p \rho_{k-p}$ for $k \geq 1$.

Variance: $\gamma_0 = \dfrac{\sigma_e^2}{1 - \phi_1 \rho_1 - \phi_2 \rho_2 - \cdots - \phi_p \rho_p}$.
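
For general $p$, the first $p$ Yule-Walker equations form a linear system in $\rho_1, \ldots, \rho_p$. The helper below is a sketch (the name ar_acf_and_var is my own, and max_lag >= p is assumed): it solves that system, recurses for higher lags, and applies the variance formula above.

```python
import numpy as np

def ar_acf_and_var(phis, sigma2=1.0, max_lag=10):
    """Sketch for Y_t = phi_1 Y_{t-1} + ... + phi_p Y_{t-p} + e_t."""
    phis = np.asarray(phis, dtype=float)
    p = len(phis)
    # Build A @ [rho_1..rho_p] = phis from rho_k = sum_i phi_i rho_{|k-i|}
    A = np.eye(p)
    for k in range(1, p + 1):
        for i in range(1, p + 1):
            if i != k:
                A[k - 1, abs(k - i) - 1] -= phis[i - 1]
    rho = np.ones(max_lag + 1)
    rho[1 : p + 1] = np.linalg.solve(A, phis)
    for k in range(p + 1, max_lag + 1):
        rho[k] = phis @ rho[k - p : k][::-1]       # the AR(p) recursion
    gamma0 = sigma2 / (1 - phis @ rho[1 : p + 1])  # variance formula above
    return rho, gamma0

rho, gamma0 = ar_acf_and_var([0.5, 0.3])
print(rho[:4], gamma0)   # rho_1 ~ 0.714, matching the AR(2) sketch above
```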

Autoregressive moving average (ARMA) process

An ARMA($p$,$q$) model is:

$$Y_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + e_t - \theta_1 e_{t-1} - \cdots - \theta_q e_{t-q}$$

ARMA(1,1) Model

For $Y_t = \phi Y_{t-1} + e_t - \theta e_{t-1}$, $|\phi| < 1$ ensures stationarity:

| Property | Expression |
| --- | --- |
| Variance | $\gamma_0 = \dfrac{1 - 2\phi\theta + \theta^2}{1 - \phi^2}\,\sigma_e^2$ |
| Autocovariance | $\gamma_1 = \phi \gamma_0 - \theta \sigma_e^2$, $\quad \gamma_k = \phi \gamma_{k-1}$ for $k \geq 2$ |
| Autocorrelation | $\rho_k = \dfrac{(1 - \theta\phi)(\phi - \theta)}{1 - 2\theta\phi + \theta^2}\,\phi^{k-1}$ for $k \geq 1$ |
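
A numeric sketch of these formulas ($\phi = 0.8$ and $\theta = 0.4$ are illustrative assumptions); note how $\rho_k$ decays geometrically like an AR(1) from the first lag onward:

```python
# Sketch: evaluate the ARMA(1,1) formulas for illustrative parameter values.
phi, theta, sigma2 = 0.8, 0.4, 1.0

gamma0 = (1 - 2 * phi * theta + theta**2) / (1 - phi**2) * sigma2
gamma1 = phi * gamma0 - theta * sigma2
rho1 = gamma1 / gamma0
# gamma_k = phi * gamma_{k-1} for k >= 2, so rho_k = rho_1 * phi**(k - 1)
print(gamma0, [round(rho1 * phi ** (k - 1), 3) for k in range(1, 5)])
```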

General ARMA($p$,$q$)

Stationarity requires the AR roots to exceed 1 in modulus. The autocorrelation satisfies $\rho_k = \phi_1 \rho_{k-1} + \cdots + \phi_p \rho_{k-p}$ for $k > q$.

Invertibility

An MA($q$) process is invertible if it can be written as an infinite AR process, which requires the roots of $1 - \theta_1 x - \theta_2 x^2 - \cdots - \theta_q x^q = 0$ to exceed 1 in modulus.

An MA(1) is invertible if $|\theta| < 1$.
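
This root condition is straightforward to check numerically. The helper below is a sketch (the name is_invertible is my own) using numpy's polynomial root finder:

```python
import numpy as np

def is_invertible(thetas):
    """Sketch: check that every root of 1 - theta_1 x - ... - theta_q x^q
    lies outside the unit circle (signs follow this note's convention)."""
    ascending = np.concatenate(([1.0], -np.asarray(thetas, dtype=float)))
    roots = np.roots(ascending[::-1])   # np.roots expects highest degree first
    return bool(np.all(np.abs(roots) > 1.0))

print(is_invertible([0.5]))   # MA(1) with |theta| < 1 -> True
print(is_invertible([1.5]))   # |theta| > 1            -> False
```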

For further reading, see Invertibility in Time Series Models.