ABSTRACT

Many models have been proposed in the literature to describe stylized facts of empirical financial time series. The autoregressive conditional heteroscedastic (ARCH) model of Engle (1982) can capture volatility clustering and large kurtosis. Several extensions of the model have also been developed to fit specific features of empirical series, e.g., the generalized ARCH (GARCH) and exponential GARCH models; see Bollerslev et al. (1992) for a survey. These models have enjoyed much success in describing the volatility evolution of a financial time series, but they cannot account for the frequency and size of the extreme jumps commonly observed in practice. A natural way to deal with these big jumps is to treat them as outliers and handle them accordingly. The goal of this chapter is to provide a rigorous investigation of the modeling and detection of outliers in a GARCH process.
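The contrast drawn above can be illustrated with a small simulation: a GARCH(1,1) path produces volatility clustering and some excess kurtosis on its own, while a single additive jump of the kind the chapter treats as an outlier inflates the kurtosis far beyond what the model generates. This is only an illustrative sketch; the parameter values (`omega`, `alpha`, `beta`) and the outlier size are assumptions for demonstration, not values from the chapter.

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate returns r_t = sigma_t * e_t with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
    e_t i.i.d. standard normal."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(n)
    r = np.empty(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        r[t] = np.sqrt(sigma2) * e[t]
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return r

def excess_kurtosis(x):
    """Sample excess kurtosis (0 for a normal sample, in expectation)."""
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

r = simulate_garch11(5000)

# Inject one additive outlier: a jump ten sample standard deviations wide,
# far larger than the GARCH dynamics themselves would typically produce.
r_out = r.copy()
r_out[2500] += 10 * r.std()

print(excess_kurtosis(r))      # moderate excess kurtosis from GARCH alone
print(excess_kurtosis(r_out))  # sharply inflated by the single outlier
```

The single contaminated observation dominates the fourth-moment statistic, which is why detecting and adjusting such points, rather than asking the GARCH dynamics to absorb them, is the natural route the chapter pursues.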