AR modeling

Of AR, MA, and ARMA, AR has the easiest model parameters to estimate because approximate maximum likelihood estimates can be obtained in closed form; MA and ARMA models generally require an iterative approach. It therefore makes sense to create separate methods instead of simply using ARMA functions with $Q=0$. Beyond efficiency, another benefit of AR modeling is that the spectral information is condensed into just a few coefficients, which can nevertheless represent the spectrum with high resolution. Many natural processes, such as human speech, are well modeled as all-pole processes. As a result, we spend the most time with AR models. Equivalent parameterizations of the AR model include the linear predictive coding (LPC) coefficients, the reflection coefficients (RC), and the roots of the AR polynomial. A $P$-th order AR model requires the first $P+1$ ACF lags ($\tau=0,1 \ldots P$) [31]. These ACF lags can be transformed to RCs or AR coefficients using invertible transformations, so they are equivalent from a modeling point of view. Thus, there are four equivalent spectral representations of an autoregressive process: AR, RC, ACF, and roots. A good source of information on the topic is the book by Kay [31].
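As a small illustration of one of these equivalences, the AR coefficients and the roots of the AR polynomial determine each other; the coefficient values below are our own example, not from the text:

```python
import numpy as np

# Sketch of the AR-coefficient <-> root equivalence: the polynomial
# A(z) is determined (up to root ordering) by its roots, so either
# form carries the same spectral information.
a = np.array([1.0, -1.5, 0.7])   # hypothetical AR(2) coefficients, a_0 = 1
roots = np.roots(a)              # poles of 1/A(z)
a_back = np.poly(roots).real     # invert the transformation
```

For a stable AR model, all roots lie strictly inside the unit circle; in this example the roots form a complex-conjugate pair with modulus $\sqrt{0.7} \approx 0.84$.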

Because AR features are so important in time-series analysis, they deserve a detailed introduction. Let ${\bf x}= \{x_t, \; \; t=1,2 \ldots N\}$, be an autoregressive process (AR) of order $P$. This means that ${\bf x}$ is obtained from the recursion

$\displaystyle x_t = -\sum_{i=1}^P \; a_i \;x_{t-i} + \epsilon_t,$ (10.39)

where ${\bf a}= \{ 1, a_1, a_2 \ldots a_P \}$ are the AR coefficients, and $\{\epsilon_t, \; \; t=1,2 \ldots N\}$ are independent, identically distributed (iid) Gaussian samples of mean zero and variance $\sigma^2$, known as the innovation process. Because such a linear expansion is called a regression in the statistics literature, and ${\bf x}$ is regressed on itself, the process is known as autoregressive. Note that $a_0$ is unity by definition, so it carries no information and we often leave it out of the discussion. An AR process may also be viewed as the output of an infinite impulse-response (IIR) filter driven by iid Gaussian noise. From systems theory, such a process may be represented by the linear system
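The recursion in eq. (10.39) can be sketched directly; this assumes zero initial conditions ($x_t = 0$ for $t \le 0$), a common convention the text does not specify, and example coefficient values of our own:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 1000, 2
a = np.array([-1.5, 0.7])             # a_1 ... a_P (a_0 = 1 is implicit)
sigma = 1.0
eps = rng.normal(0.0, sigma, size=N)  # iid Gaussian innovation process

# x_t = -sum_{i=1}^P a_i x_{t-i} + eps_t, with x_t = 0 for t <= 0
x = np.zeros(N)
for t in range(N):
    for i in range(1, min(t, P) + 1):
        x[t] -= a[i - 1] * x[t - i]
    x[t] += eps[t]
```

This is exactly the IIR filtering view: ${\bf x}$ is the innovation sequence filtered by $1/A(z)$, so `scipy.signal.lfilter([1.0], [1.0, -1.5, 0.7], eps)` would produce the same sequence.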

$\displaystyle E(z) \rightarrow \left[ \frac{1}{A(z)} \right] \rightarrow X(z).$

In other words, ${\bf x}$ is a process with Z-transform $X(z)$ produced by passing the innovations process $E(z)$ through a linear filter with Z-transform $\frac{1}{A(z)}$ where

$\displaystyle A(z) = 1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_P z^{-P}.$

The power spectrum of the AR process is (See eq. 10.8)

$\displaystyle P_x(\omega) = \sigma^2 \frac{1}{\vert A(e^{i\omega})\vert^2}.$ (10.40)
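Eq. (10.40) is straightforward to evaluate numerically; the function below is a sketch with names of our own choosing, taking the full coefficient vector $\{1, a_1 \ldots a_P\}$:

```python
import numpy as np

def ar_power_spectrum(a, sigma2, omega):
    """Evaluate eq. (10.40): P_x(w) = sigma^2 / |A(e^{iw})|^2.

    a     : full AR coefficient vector [1, a_1, ..., a_P]
    omega : frequency grid in radians
    """
    omega = np.atleast_1d(np.asarray(omega, dtype=float))
    k = np.arange(len(a))
    # A(e^{iw}) = sum_k a_k e^{-i w k}, evaluated at each frequency
    A = np.exp(-1j * np.outer(omega, k)) @ np.asarray(a)
    return sigma2 / np.abs(A) ** 2
```

As a sanity check, with $P=0$ (so $A(z)=1$) the spectrum is flat at $\sigma^2$, i.e. white noise.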

The corresponding length-$N$ circularly-stationary process has circular power spectrum (See eq. 10.9)

$\displaystyle \rho_k = \sigma^2 \frac{1}{\vert A_k\vert^2},$ (10.41)

where $A_k = A(e^{2\pi i k/N})$, the $N$-point DFT of the zero-padded coefficient sequence ${\bf a}$.
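A sketch of eq. (10.41), under the assumption (consistent with the circular spectrum of eq. 10.9) that $A_k$ is the $N$-point DFT of the coefficient vector zero-padded to length $N$; the coefficient values are our own example:

```python
import numpy as np

N = 64
a = np.array([1.0, -1.5, 0.7])     # example AR(2) coefficients, a_0 = 1
sigma2 = 1.0

A_k = np.fft.fft(a, n=N)           # n=N zero-pads a to length N
rho = sigma2 / np.abs(A_k) ** 2    # circular power spectrum, eq. (10.41)
```

Because ${\bf a}$ is real, $A_{N-k} = A_k^*$, so the circular spectrum is symmetric: $\rho_k = \rho_{N-k}$.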


