# Random walk as AR process

In the previous post I mentioned that our review also presented a novel result, in which we analyzed the ARFIMA process. Understanding the ARFIMA process requires some specialized knowledge, which we will cover in this and the next few posts.

In this post we will take a well-known physical model, the random walk, and try to understand it in the context of an econometric model, the AR(p) process.

## White noise

A key component of the random walk is white noise. It is simply a collection of independent random values (samples from some given distribution). As these values neither depend on past samples nor influence future ones, the white noise series is said to be delta-correlated:

\begin{equation} \mathrm{ACF}(\tau) = \left\langle \xi(t_1) \xi(t_1 + \tau) \right\rangle = \delta(\tau) . \end{equation}

In the above, ACF stands for the auto-correlation function and \( \tau \) is the lag. As you can explore in the app below, no matter what distribution is used, the ACF is close to zero for all lags except \( \tau = 0 \).

Note that earlier we examined the spectral density of the white noise process (see this post). It is completely flat, which indicates that there are no periodic patterns in the white noise series.
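The delta-correlation property is easy to check numerically. Below is a minimal NumPy sketch, with a hypothetical `acf` helper defined here for illustration: the sample ACF of white noise is exactly 1 at lag zero and close to zero everywhere else.

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation of series x for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / var
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(42)
noise = rng.normal(size=10_000)   # Gaussian white noise samples
rho = acf(noise, max_lag=20)
# rho[0] is exactly 1; all other lags fluctuate near zero
```

Replacing `rng.normal` with, say, `rng.uniform` changes the distribution but not the near-zero ACF at nonzero lags, as the app above demonstrates.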

## Random walk

Mathematically, a random walk is defined as a sum (or an integral) of white noise:

\begin{equation} x_t = \sum_{i=1}^{t} \xi_i . \end{equation}

In the above, \( \xi_i \) is a discrete-time sample of the white noise process. This equation can also be rewritten in recursive form:

\begin{equation} x_t = x_{t-1} + \xi_t , \quad \text{given} \quad x_0 = 0 . \end{equation}

This recursive form explicitly states that the next sample, \( x_t \), of the random walk process depends on the previous sample, \( x_{t-1} \). Hence the random walk will be auto-correlated.
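The two definitions describe the same trajectory, which a short NumPy sketch can confirm: building the series by the recursion \( x_t = x_{t-1} + \xi_t \) gives the same result as the cumulative sum of the noise.

```python
import numpy as np

rng = np.random.default_rng(0)
xi = rng.normal(size=1_000)   # white noise increments

# Recursive form: x_t = x_{t-1} + xi_t, with x_0 = 0
x_rec = np.zeros(len(xi) + 1)
for t in range(1, len(x_rec)):
    x_rec[t] = x_rec[t - 1] + xi[t - 1]

# Sum form: x_t = sum of xi_1..xi_t
x_sum = np.concatenate(([0.0], np.cumsum(xi)))
```

Both arrays hold the same random walk trajectory.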

Strangely, the ACF decays extremely slowly, even though we have introduced dependence on only one previous sample of the process. This suggests that the ACF should not be used to analyze non-stationary series. And the random walk is, in fact, a non-stationary process.

A trivial way to transform a non-stationary series into a stationary one is to apply a differencing procedure prior to calculating the ACF. For the random walk it is sufficient to apply this procedure once:

\begin{equation} y_t = x_{t} - x_{t-1} . \end{equation}

The ACF of \( y_t \), as expected, is exactly the same as the ACF of white noise.
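This is no surprise: differencing exactly undoes the summation that defines the random walk, so \( y_t = \xi_t \). A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
xi = rng.normal(size=5_000)

x = np.cumsum(xi)           # random walk built from white noise xi
y = np.diff(x, prepend=0.0) # first difference: y_t = x_t - x_{t-1}
# y recovers the original white noise increments
```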

## Why is the random walk an AR process?

AR is an abbreviation of the term auto-regressive. "Auto" implies "self", while "regression" is a dependency on some kind of predictor. So an auto-regressive process needs to be able to predict itself. The only way to do this is for the new sample, \( x_t \), to depend on past samples, \( x_{t-i} \). In general:

\begin{equation} x_t = \sum_{i=1}^{p} \alpha_i x_{t-i} + \xi_t . \end{equation}

Now try setting \( p=1 \) and \( \alpha_1=1 \). Then compare the result against the recursive form of the random walk process. Can you see it now?
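If you would rather let the computer do the comparison, here is a minimal NumPy sketch: the AR(1) recursion with \( \alpha_1 = 1 \), driven by a given noise series, reproduces the random walk built from the same noise sample for sample.

```python
import numpy as np

rng = np.random.default_rng(7)
xi = rng.normal(size=1_000)   # shared white noise driving both processes

# AR(1) recursion: x_t = alpha_1 * x_{t-1} + xi_t
alpha_1 = 1.0
x_ar = np.zeros(len(xi))
prev = 0.0
for t in range(len(xi)):
    x_ar[t] = alpha_1 * prev + xi[t]
    prev = x_ar[t]

# Random walk: cumulative sum of the same noise
x_rw = np.cumsum(xi)
```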