Procedure

Conditional least squares (CLS) estimates model parameters by minimizing the conditional sum-of-squares function $S_c = \sum_t e_t^2$, conditioning on initial values and treating the model as a regression problem.

AR(1)

Step 1: Estimate $\mu$

For the AR(1) model $Y_t - \mu = \phi(Y_{t-1} - \mu) + e_t$, the conditional sum of squares is

$$S_c(\phi, \mu) = \sum_{t=2}^{n} \left[ (Y_t - \mu) - \phi (Y_{t-1} - \mu) \right]^2$$

Set $\partial S_c / \partial \mu = 0$, yielding:

$$\hat{\mu} = \frac{\sum_{t=2}^{n} Y_t - \phi \sum_{t=2}^{n} Y_{t-1}}{(n-1)(1-\phi)}$$

For large $n$ with a stationary process, both $\frac{1}{n-1}\sum_{t=2}^{n} Y_t$ and $\frac{1}{n-1}\sum_{t=2}^{n} Y_{t-1}$ are approximately $\bar{Y}$, so:

$$\hat{\mu} \approx \frac{\bar{Y} - \phi \bar{Y}}{1 - \phi} = \bar{Y}$$
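The large-$n$ approximation $\hat{\mu} \approx \bar{Y}$ can be checked on simulated data. A minimal sketch (the AR(1) parameters, sample size, and seed are illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(1): Y_t - mu = phi * (Y_{t-1} - mu) + e_t,
# with illustrative values phi = 0.6, mu = 5.
n, phi, mu = 2000, 0.6, 5.0
y = np.empty(n)
y[0] = mu
for t in range(1, n):
    y[t] = mu + phi * (y[t - 1] - mu) + rng.standard_normal()

# Exact CLS estimate of mu (phi treated as known here):
# mu_hat = (sum_{t=2}^n Y_t - phi * sum_{t=2}^n Y_{t-1}) / ((n-1)(1-phi))
mu_hat = (y[1:].sum() - phi * y[:-1].sum()) / ((n - 1) * (1 - phi))

print(mu_hat, y.mean())  # both should be close to 5 for large n
```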

Step 2: Estimate $\phi$

Set $\partial S_c / \partial \phi = 0$ with $\mu = \bar{Y}$:

$$\hat{\phi} = \frac{\sum_{t=2}^{n} (Y_t - \bar{Y})(Y_{t-1} - \bar{Y})}{\sum_{t=2}^{n} (Y_{t-1} - \bar{Y})^2}$$

Comparison with $r_1$

This is similar to the lag-1 sample autocorrelation

$$r_1 = \frac{\sum_{t=2}^{n} (Y_t - \bar{Y})(Y_{t-1} - \bar{Y})}{\sum_{t=1}^{n} (Y_t - \bar{Y})^2}$$

but the denominator of $\hat{\phi}$ is missing the single term $(Y_n - \bar{Y})^2$. For stationary processes with large $n$, the difference is negligible.
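A quick numerical check of how close $\hat{\phi}$ and $r_1$ are in practice; the simulated series and its parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a zero-mean AR(1) with illustrative phi = 0.6.
n, phi = 500, 0.6
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

yb = y - y.mean()

# CLS estimate: denominator sums (Y_{t-1} - Ybar)^2 for t = 2..n.
phi_cls = (yb[1:] * yb[:-1]).sum() / (yb[:-1] ** 2).sum()

# Lag-1 sample autocorrelation r1: same numerator, but the denominator
# sums over all t = 1..n, i.e. it has the extra term (Y_n - Ybar)^2.
r1 = (yb[1:] * yb[:-1]).sum() / (yb ** 2).sum()

print(phi_cls, r1)  # nearly identical
```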

AR(2)

Extend the AR(1) approach: estimate $\mu$ by $\bar{Y}$ and minimize $S_c(\phi_1, \phi_2)$ over both AR coefficients.

For $\hat{\phi}_1, \hat{\phi}_2$, solve the sample Yule-Walker equations:

$$r_1 = \hat{\phi}_1 + r_1 \hat{\phi}_2$$
$$r_2 = r_1 \hat{\phi}_1 + \hat{\phi}_2$$
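The two equations form a 2×2 linear system. A minimal sketch with hypothetical sample autocorrelations $r_1$, $r_2$ (the values are made up for illustration):

```python
import numpy as np

# Hypothetical sample autocorrelations (made-up values for illustration).
r1, r2 = 0.8, 0.5

# Sample Yule-Walker equations in matrix form:
#   [1   r1] [phi1]   [r1]
#   [r1   1] [phi2] = [r2]
R = np.array([[1.0, r1],
              [r1, 1.0]])
phi1_hat, phi2_hat = np.linalg.solve(R, np.array([r1, r2]))

print(phi1_hat, phi2_hat)
```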

AR(p)

Minimizing $S_c(\phi_1, \ldots, \phi_p)$ yields, apart from end effects, the same system as the sample Yule-Walker equations:

$$r_k = \hat{\phi}_1 r_{k-1} + \hat{\phi}_2 r_{k-2} + \cdots + \hat{\phi}_p r_{k-p}, \quad k = 1, 2, \ldots, p$$

(with $r_0 = 1$ and $r_{-k} = r_k$).

Connection

For a stationary AR(p) process, conditional least squares and the method of moments (Yule-Walker) produce nearly identical estimates in large samples.
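This near-equivalence can be seen empirically by fitting a simulated AR(2) both ways; the model parameters and sample size below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a stationary AR(2) with illustrative phi1 = 0.5, phi2 = 0.3.
n, phi1, phi2 = 2000, 0.5, 0.3
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.standard_normal()
yb = y - y.mean()

# Conditional least squares: regress Y_t on (Y_{t-1}, Y_{t-2}).
X = np.column_stack([yb[1:-1], yb[:-2]])
cls_est, *_ = np.linalg.lstsq(X, yb[2:], rcond=None)

# Method of moments: solve the sample Yule-Walker equations.
def acf(k):
    return (yb[k:] * yb[:-k]).sum() / (yb ** 2).sum()

r1, r2 = acf(1), acf(2)
yw_est = np.linalg.solve(np.array([[1.0, r1], [r1, 1.0]]),
                         np.array([r1, r2]))

print(cls_est, yw_est)  # nearly identical for large n
```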

MA(1)

For the MA(1) model $Y_t = e_t - \theta e_{t-1}$, use the invertible AR($\infty$) representation:

$$Y_t = -\theta Y_{t-1} - \theta^2 Y_{t-2} - \theta^3 Y_{t-3} - \cdots + e_t$$

The conditional sum of squares becomes:

$$S_c(\theta) = \sum_{t=1}^{n} e_t^2, \qquad e_t = Y_t + \theta e_{t-1}, \quad e_0 = 0$$

Nonlinear Optimization Required

$S_c(\theta)$ is nonlinear in $\theta$: each $e_t$ depends on $\theta$ through the recursion, so $S_c(\theta)$ is a high-degree polynomial in $\theta$. No explicit solution exists; use numerical methods.
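A sketch of the numerical approach: build the errors recursively from $e_0 = 0$ and minimize $S_c(\theta)$ over the invertible range. The simulated series, the value $\theta = 0.7$, and the crude grid search are illustrative choices (a real fit would use a proper optimizer):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate MA(1): Y_t = e_t - theta * e_{t-1}, illustrative theta = 0.7.
n, theta_true = 1000, 0.7
e = rng.standard_normal(n + 1)
y = e[1:] - theta_true * e[:-1]

def S_c(theta, y):
    """Conditional sum of squares for MA(1): e_t = Y_t + theta * e_{t-1},
    starting from the conditioning value e_0 = 0."""
    e_prev, total = 0.0, 0.0
    for yt in y:
        e_t = yt + theta * e_prev
        total += e_t ** 2
        e_prev = e_t
    return total

# No closed form: minimize numerically. A crude grid search over the
# invertible range |theta| < 1 stands in for a real optimizer here.
grid = np.linspace(-0.99, 0.99, 397)
theta_hat = grid[np.argmin([S_c(th, y) for th in grid])]

print(theta_hat)  # should land near 0.7
```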

Mixed Models (ARMA)

For ARMA(p,q), the conditional sum of squares involves both AR and MA components. The errors satisfy the recursion

$$e_t = Y_t - \phi_1 Y_{t-1} - \cdots - \phi_p Y_{t-p} + \theta_1 e_{t-1} + \cdots + \theta_q e_{t-q}$$

and $S_c(\phi_1, \ldots, \phi_p, \theta_1, \ldots, \theta_q) = \sum_{t=p+1}^{n} e_t^2$.

Model Fitting Procedure

  1. Set initial values: $e_p = e_{p-1} = \cdots = e_{p+1-q} = 0$
  2. Compute the errors $e_t$ recursively for $t = p+1, \ldots, n$
  3. Minimize $S_c$ numerically (e.g. Gauss-Newton or another nonlinear optimizer)
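The three steps can be sketched for an ARMA(1,1), where $p = q = 1$ and the initial condition is a single zero error; the true parameters, the simulated data, and the coarse grid search are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate ARMA(1,1): Y_t = phi*Y_{t-1} + e_t - theta*e_{t-1},
# with illustrative phi = 0.7, theta = 0.2.
n, phi_true, theta_true = 1500, 0.7, 0.2
e = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t] - theta_true * e[t - 1]

def S_c(params, y):
    """Steps 1-2: set the initial error to zero, then build the errors
    recursively: e_t = Y_t - phi*Y_{t-1} + theta*e_{t-1}."""
    phi, theta = params
    e_prev, total = 0.0, 0.0
    for t in range(1, len(y)):
        e_t = y[t] - phi * y[t - 1] + theta * e_prev
        total += e_t ** 2
        e_prev = e_t
    return total

# Step 3: minimize numerically -- a coarse grid search for illustration;
# a real fit would use a proper optimizer (e.g. Gauss-Newton).
grid = np.linspace(-0.9, 0.9, 37)
best = min(((p, q) for p in grid for q in grid), key=lambda pt: S_c(pt, y))
print(best)
```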