Unconditional Least Squares

Unconditional least squares (ULS) minimizes the unconditional sum-of-squares function; it is a compromise between conditional least squares and full maximum likelihood estimation.

Procedure

Step 1: Set Up the Function

For an AR(1) model y_t = μ + φ(y_{t−1} − μ) + ε_t, for example, the unconditional sum of squares is

  S(φ, μ) = (1 − φ²)(y₁ − μ)² + Σ_{t=2}^{n} [(y_t − μ) − φ(y_{t−1} − μ)]²

The first term weights the initial observation by its stationary (unconditional) variance factor; the remaining terms are the usual one-step-ahead prediction errors.
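As a concrete illustration, the AR(1) version of this objective can be coded in a few lines. This is a sketch under assumed notation (model y_t = μ + φ(y_{t−1} − μ) + ε_t; the function name is hypothetical):

```python
import numpy as np

def uls_objective(params, y):
    """Unconditional sum of squares for a mean-mu AR(1) model (illustrative)."""
    phi, mu = params
    d = y - mu                                   # deviations from the mean
    # The first observation enters with weight (1 - phi^2) -- the
    # "unconditional" part that conditional least squares drops.
    s = (1.0 - phi**2) * d[0]**2
    # Remaining observations contribute ordinary one-step-ahead squared errors.
    s += np.sum((d[1:] - phi * d[:-1])**2)
    return s
```

For instance, with φ = 0 and μ = 0 the objective reduces to the plain sum of squared observations.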

Step 2: Set Derivatives to Zero

Setting the partial derivatives to zero gives the normal equations ∂S/∂φ = 0 and ∂S/∂μ = 0.

No Closed-Form Solution

The factor (1 − φ²) multiplying the first squared term makes these equations nonlinear in φ and μ. Numerical optimization is required.
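The nonlinearity can be checked symbolically. The sketch below (using sympy, with assumed symbol names) builds a three-observation version of the objective and shows that both partial derivatives are degree-3 polynomials in (φ, μ), so the normal equations form a nonlinear system:

```python
import sympy as sp

phi, mu, y1, y2, y3 = sp.symbols('phi mu y1 y2 y3')

# Three-observation unconditional sum of squares for an AR(1) model (illustrative).
S = ((1 - phi**2) * (y1 - mu)**2
     + ((y2 - mu) - phi * (y1 - mu))**2
     + ((y3 - mu) - phi * (y2 - mu))**2)

# The normal equations come from setting these partial derivatives to zero.
dS_dphi = sp.expand(sp.diff(S, phi))
dS_dmu = sp.expand(sp.diff(S, mu))

# Both derivatives contain mixed terms such as phi*mu**2, traceable to the
# (1 - phi^2) factor, so neither equation is linear in (phi, mu).
print(sp.Poly(dS_dphi, phi, mu).total_degree())  # → 3
print(sp.Poly(dS_dmu, phi, mu).total_degree())   # → 3
```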

Step 3: Use Numerical Methods

Apply an iterative algorithm such as:

  • Newton-Raphson
  • Gradient descent
  • Gauss-Newton
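A minimal numerical sketch, assuming scipy is available: it simulates a short AR(1) series (the values φ = 0.6, μ = 2.0 and the objective are illustrative, not from the original) and minimizes the unconditional sum of squares with BFGS, a quasi-Newton method in the family listed above.

```python
import numpy as np
from scipy.optimize import minimize

def uls_objective(params, y):
    # Unconditional sum of squares for an AR(1) model (illustrative).
    phi, mu = params
    d = y - mu
    return (1.0 - phi**2) * d[0]**2 + np.sum((d[1:] - phi * d[:-1])**2)

# Simulate a short AR(1) series with phi = 0.6, mu = 2.0 (assumed values).
rng = np.random.default_rng(0)
y = np.empty(200)
y[0] = 2.0
for t in range(1, len(y)):
    y[t] = 2.0 + 0.6 * (y[t - 1] - 2.0) + rng.normal(scale=0.5)

# Quasi-Newton (BFGS) minimization; the sample mean is a natural start for mu.
res = minimize(uls_objective, x0=[0.0, np.mean(y)], args=(y,), method="BFGS")
phi_hat, mu_hat = res.x
```

The recovered estimates should land close to the simulated φ and μ, up to sampling error in a series of this length.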

Advantages

  • Uses all observations (conditional LS conditions on, and effectively discards, the initial values)
  • Lower computational burden than full maximum likelihood
  • More accurate for short series and for seasonal models