ABSTRACT
A time series is defined as a series of discrete data points representing measurements of some physical quantity over time. By this definition, any signal can be conceptualized as a time series, provided that the independent variable is time. However, since most real-world signals are analog and continuous, they must be digitized, or discretized, to fit the definition of a time series.
To properly differentiate between the two, a continuous signal is denoted by
f(t) and the corresponding discretized time series is denoted by f [n], where
n is the sample number (n ∈ Z, where Z is the set of integers). The difference between the two is illustrated in Fig. 2.1, which shows the continuous signal f(t) = sin(t) plotted as a function of time alongside the corresponding discretized time series f[n] = sin(nT), where T is the sampling interval, here set to one second (T = 1 s). Since all signals considered in this book
are time-dependent and digitized, the terms signal and time series will be
used interchangeably unless specified otherwise.
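As a minimal sketch of the relationship described above (an illustration, not code from the book), the discretization of the continuous signal f(t) = sin(t) with sampling interval T = 1 second can be expressed as evaluating f at the instants t = nT:

```python
import math

T = 1.0  # sampling interval in seconds (T = 1 s, as in Fig. 2.1)

def f_continuous(t: float) -> float:
    """Continuous signal f(t) = sin(t)."""
    return math.sin(t)

def f_discrete(n: int) -> float:
    """Discretized time series f[n] = sin(nT), with integer sample number n."""
    return math.sin(n * T)

# The time series is the continuous signal sampled at t = nT:
samples = [f_discrete(n) for n in range(10)]
```

Each element of `samples` coincides with the continuous signal evaluated at the corresponding sampling instant, which is exactly what the notation f[n] = f(nT) expresses.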