8.3 Cointegration
Variables that follow each other may in fact be cointegrated, meaning that they are not just moving together by coincidence but actually have a long-run relationship.
To check for this, we test whether both series (y and x) are nonstationary while the regression between them is stationary. If that is the case, we can predict based on this relationship.
One can also say that series are cointegrated if they move with each other, e.g., share the same trend. See the following link for an example: https://www.wallstreetmojo.com/cointegration/.
Rule of thumb: when the difference between the variables stays roughly constant over time, they are cointegrated.
This implies that we do not have to difference the series, since the linear combination of the variables
\[\begin{equation} y_t-\beta x_t=\epsilon_t \tag{8.1} \end{equation}\]
is stationary. The intuition is that when you subtract \(x\) times the constant \(\beta\) from \(y\), you are left with the error term, which is stationary, because subtracting the series from each other takes out the common unit root.
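For example (a minimal illustrative setup, assuming \(x_t\) is a pure random walk), suppose

\[ x_t = x_{t-1} + u_t, \qquad y_t = \beta x_t + \epsilon_t, \]

where \(u_t\) and \(\epsilon_t\) are stationary (e.g., white noise). Both \(x_t\) and \(y_t\) then contain the same stochastic trend \(\sum_{s \le t} u_s\), yet

\[ y_t - \beta x_t = \epsilon_t \]

is stationary: the common unit root cancels out in the linear combination.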
8.3.1 Checking for cointegration
The process:
There are basically two options for testing for this. The second option is the better procedure.
Option 1
- Check if the series have unit roots (are nonstationary).
- Check that their linear combination \(y_t-\beta x_t\) does not have a unit root (i.e., is stationary); see the sketch after this list.
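A minimal sketch of Option 1, assuming the series are available in R and that the tseries package is installed (the data below are simulated purely for illustration, and \(\beta\) is treated as known, which is rarely the case in practice):

```r
# Option 1 sketch: unit-root tests on each series and on their linear combination
# (assumes the 'tseries' package is installed; data are simulated for illustration)
library(tseries)

set.seed(123)
n    <- 200
beta <- 2
x <- cumsum(rnorm(n))      # random walk: has a unit root (nonstationary)
y <- beta * x + rnorm(n)   # shares x's stochastic trend plus stationary noise

# Step 1: both series should fail to reject the unit-root null (high p-values)
adf.test(x)
adf.test(y)

# Step 2: the linear combination y - beta * x should be stationary (low p-value)
adf.test(y - beta * x)
```

In practice \(\beta\) is unknown and has to be estimated, which is exactly why the regression-based procedure in Option 2 is usually preferred.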
Option 2
This can be done with an Engle-Granger (EG) test. The procedure is as follows (see the sketch after this list):
- Test whether \(y_t\) and \(x_t\) are nonstationary using an ADF test.
- Regress \(y_t\) on \(x_t\) and test whether the residuals (\(e_t\)) are stationary; this can be done with an ADF (augmented Dickey-Fuller) test, but it is also a good idea to plot the residuals.
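A minimal sketch of the EG procedure in R, again using simulated series and the tseries package (note that, strictly speaking, the critical values of a standard ADF test are not exact for residuals from an estimated cointegrating regression, so treat the p-value as indicative):

```r
# Engle-Granger sketch (assumes the 'tseries' package is installed)
library(tseries)

set.seed(123)
x <- cumsum(rnorm(200))    # simulated nonstationary regressor
y <- 2 * x + rnorm(200)    # simulated series cointegrated with x

# Step 1: test that y and x are nonstationary (should not reject the unit-root null)
adf.test(y)
adf.test(x)

# Step 2: regress y on x and test whether the residuals are stationary
fit <- lm(y ~ x)
e   <- residuals(fit)

plot.ts(e)    # plotting the residuals is also a good idea
adf.test(e)   # stationary residuals suggest that y and x are cointegrated
```

The tseries package also provides po.test() (Phillips-Ouliaris), which performs a residual-based cointegration test in a single call.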
See an example of this procedure in the exercise with the Dairy Data in section 8.4.1.2.