Multivariate Approach to Time Series Model Identification

Filed in Articles on July 4, 2022


ABSTRACT

This work proposes an exact and systematic model identification approach that is entirely new and addresses most of the challenges of existing methods. We developed quadratic discriminant functions for various orders of autoregressive moving average (ARMA) models.

An algorithm to be used alongside these functions was also developed. To this end, three hundred sets of time series data were simulated for the development of the functions.

Another twenty-five sets of simulated time series data were used to test the classifiers, which correctly classified twenty-three of the twenty-five sets.

The two cases of misclassification merely imply that our algorithm requires a second iteration to correctly identify the model in question. The algorithm was also applied to some real-life time series data, which it correctly classified in two iterations.

INTRODUCTION

Model identification is a crucial part of time series model development. The main task of time series modeling is first to examine the series at hand so as to establish the theoretical model that generated the series.

This task seems to be the most challenging and most ambiguous in time series modeling, and it has been approached from different perspectives over time. One of the most popular approaches is that of Box and Jenkins, presented in Box and Jenkins (1976).

Their method involves going through some iterative steps before a final model is selected.

The initial step involves calculating the sample autocorrelation function (ACF) and partial autocorrelation function (PACF) of the series at various lags and comparing their behaviour with the known behaviour of some theoretical models; the model that best approximates the sample behaviour is tentatively selected.
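To make this first step concrete, the sample ACF and PACF can be computed directly; the sketch below uses only NumPy, estimating the PACF from the sample ACF via the Durbin-Levinson recursion, and illustrates the behaviour on a simulated AR(1) series (the φ = 0.7 coefficient and series length are illustrative choices, not values from the paper).

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation function at lags 0..nlags."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    denom = x @ x
    return np.array([x[:n - k] @ x[k:] / denom for k in range(nlags + 1)])

def sample_pacf(x, nlags):
    """Sample PACF via the Durbin-Levinson recursion on the sample ACF."""
    r = sample_acf(x, nlags)
    pacf = np.zeros(nlags + 1)
    pacf[0] = 1.0
    phi = np.zeros((nlags + 1, nlags + 1))
    for k in range(1, nlags + 1):
        num = r[k] - phi[k - 1, 1:k] @ r[1:k][::-1]
        den = 1.0 - phi[k - 1, 1:k] @ r[1:k]
        phi[k, k] = num / den
        for j in range(1, k):
            phi[k, j] = phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
        pacf[k] = phi[k, k]
    return pacf

# Simulate an AR(1) series: x_t = 0.7 x_{t-1} + e_t
rng = np.random.default_rng(0)
n = 500
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + e[t]

acf = sample_acf(x, 10)
pacf = sample_pacf(x, 10)
# For an AR(1), the sample ACF should decay roughly geometrically (~0.7^k),
# while the sample PACF should cut off after lag 1.
```

Identification then amounts to judging, by eye, whether such patterns match the theoretical signatures of AR, MA, or mixed ARMA models, which is precisely where the subjectivity criticized below enters.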

There are two serious problems with this method. The first is that one needs to fit several models, or make several adjustments, to arrive at the final model.

This makes the method computationally expensive. Another serious problem is the method's inability to accurately differentiate between some classes of models.

For example, it is not easy to determine the values of p and q when fitting an ARMA(p, q) model with p, q > 0, since both the ACF and PACF tail off. However, Tsay and Tiao (1984, 1985) addressed this problem by proposing the use of the extended autocorrelation function (EACF) and the smallest canonical correlation (SCAN), respectively.

The Tsay and Tiao methods are mere extensions of the Box and Jenkins approach, as they involve comparing the behaviour of the sample EACF and smallest canonical correlation with their theoretical behaviour.

The difficulty of matching these behaviours is even greater in these approaches because, for example, the clear cut-off in the theoretical EACF table is hardly observed in the sample EACF (Cryer and Chan, 2008).

Apart from the Box and Jenkins approach, Akaike (1969) and many other researchers have done considerable work on various forms of information criteria.

Their approach is based on values calculated from the residuals of already fitted models. The statistic calculated from these residuals is interpreted as the information loss resulting from fitting the model.

The order of the model that minimizes the information loss is finally adopted. The model identification stage of time series modeling has suffered severe deficiencies over time.
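As a minimal illustration of the information-criterion approach, the sketch below fits AR(p) models by least squares for a range of orders and selects the one minimizing AIC (using the common form n·log(RSS/n) + 2k). The true order, coefficients, and series length are illustrative assumptions; note the point made above that this requires fitting every candidate model before a choice can be made.

```python
import numpy as np

def fit_ar_ols(x, p):
    """Fit an AR(p) model with intercept by least squares.
    Returns the residual sum of squares and effective sample size."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if p == 0:
        resid = x - x.mean()
        return float(resid @ resid), n
    # Columns are the series lagged by 1..p
    X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
    y = x[p:]
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid), len(y)

def aic_ar(x, p):
    """AIC for an AR(p) fit: n*log(RSS/n) + 2*(number of parameters)."""
    rss, n = fit_ar_ols(x, p)
    return n * np.log(rss / n) + 2 * (p + 1)

# Simulate an AR(2) series: x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + e_t
rng = np.random.default_rng(1)
n = 1000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + e[t]

aics = {p: aic_ar(x, p) for p in range(6)}
best_p = min(aics, key=aics.get)
# The AIC should heavily penalize underfitting (p < 2), though AIC is
# known to sometimes select an order slightly above the true one.
```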

All the available methods are deficient in terms of accurately selecting the model fit. There is no well-defined procedure that gives the exact model, or at least a known error margin, before going ahead to fit the model.

Some of the model identification approaches, especially the Box and Jenkins approach, are more art than science, as they are highly judgmental. The approach presented here consists of well-spelt-out rules that guide model development.
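The paper's quadratic discriminant functions and their exact features are not reproduced in this excerpt, so the following is only a toy sketch of the general idea, under stated assumptions: sample autocorrelations at the first few lags are used as features, and a hand-rolled quadratic discriminant classifier (per-class Gaussian mean, covariance, and prior) is trained on simulated series to distinguish two candidate model classes, here AR(1) versus MA(1) with illustrative coefficients.

```python
import numpy as np

def sim_arma(phi, theta, n, rng):
    """Simulate an ARMA(1,1)-type series (set phi or theta to 0 for AR/MA)."""
    e = rng.standard_normal(n + 50)
    x = np.zeros(n + 50)
    for t in range(1, n + 50):
        x[t] = phi * x[t - 1] + e[t] + theta * e[t - 1]
    return x[50:]  # drop burn-in

def acf_features(x, m=3):
    """First m sample autocorrelations, used here as classification features."""
    x = x - x.mean()
    d = x @ x
    return np.array([x[:-k] @ x[k:] / d for k in range(1, m + 1)])

class QDA:
    """Quadratic discriminant analysis with per-class Gaussian estimates."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False)
            self.params_[c] = (mu, np.linalg.inv(cov),
                               np.log(np.linalg.det(cov)),
                               np.log(len(Xc) / len(X)))
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, icov, logdet, logprior = self.params_[c]
            d = X - mu
            q = np.einsum('ij,jk,ik->i', d, icov, d)  # quadratic form per row
            scores.append(-0.5 * logdet - 0.5 * q + logprior)
        return self.classes_[np.argmax(np.vstack(scores), axis=0)]

rng = np.random.default_rng(42)
# Training set: 150 AR(1) series (class 0) and 150 MA(1) series (class 1)
X = np.array([acf_features(sim_arma(0.6, 0.0, 300, rng)) for _ in range(150)]
             + [acf_features(sim_arma(0.0, 0.6, 300, rng)) for _ in range(150)])
y = np.array([0] * 150 + [1] * 150)
clf = QDA().fit(X, y)

# Classify two fresh series, one of each class
test = np.array([acf_features(sim_arma(0.6, 0.0, 300, rng)),
                 acf_features(sim_arma(0.0, 0.6, 300, rng))])
pred = clf.predict(test)
```

The classes separate well here because an MA(1) ACF cuts off after lag 1 while an AR(1) ACF decays geometrically, so the lag-2 and lag-3 features carry most of the discriminating information; the paper's method extends this idea to many ARMA orders with functions developed from three hundred simulated series.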

REFERENCES

Akaike, H. (1969). Fitting autoregressive models for prediction. Annals of the Institute of Statistical Mathematics, 21: 243-247.
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19(6): 716-723.
Akaike, H. (1979). A Bayesian extension of the minimum AIC procedure of autoregressive model fitting. Biometrika, 66: 237-242.
Akaike, H. (1980). Likelihood and the Bayes procedure. In Bernardo, J. M., et al. (eds.), Bayesian Statistics, Valencia: University Press, pp. 143-166.
Anderson, D. R. (2008). Model Based Inference in the Life Sciences. Springer.
Anderson, O. D. (1975). Distinguishing between simple Box and Jenkins models. International Journal of Mathematical Education in Science and Technology, 6(4): 461-465.
Bhansali, R. J. (1993). Order selection for time series models: a review. In Rao, T. S. (ed.), Developments in Time Series Analysis. Chapman and Hall, London.
Bickel, P. J. and Levina, E. (2004). Some theory for Fisher's linear discriminant function, "naive Bayes", and some alternatives when there are many more variables than observations. Bernoulli, 10(6): 989-1010.
Box, G. E. P. and Jenkins, G. M. (1976). Time Series Analysis: Forecasting and Control. Holden-Day.
Box, G. E. P. and Jenkins, G. M. (1970). Time Series Analysis: Forecasting and Control. Holden-Day.
