This chapter provides a brief and informal introduction to hidden Markov models (HMMs) and their many potential uses, and discusses issues fundamental to understanding the structure of such models. It gives an account of mixture distributions, because the marginal distribution of a hidden Markov model is a mixture distribution, and it introduces Markov chains, which provide the underlying 'parameter process' of a hidden Markov model. HMMs are models in which the distribution that generates an observation depends on the state of an underlying and unobserved Markov process. One aspect of mixtures of continuous distributions differs from the discrete case and is worth highlighting: in the vicinity of certain parameter combinations, the likelihood can be unbounded. An irreducible Markov chain on a finite state space has a unique, strictly positive, stationary distribution. The chapter also presents an overview of the key concepts discussed in this book.
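As an illustration of the claim about irreducible chains, the sketch below computes the stationary distribution of a small Markov chain numerically. The transition matrix `Gamma` is a hypothetical example chosen for this sketch, not one taken from the text; the solution method exploits the fact that the stationary distribution delta satisfies both delta Gamma = delta and the normalisation constraint that its entries sum to one.

```python
import numpy as np

# Transition probability matrix of a hypothetical 3-state irreducible
# Markov chain (illustrative values; rows sum to 1).
Gamma = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.1, 0.5],
])

# The stationary distribution delta satisfies delta @ Gamma = delta
# with sum(delta) = 1.  Since delta (I - Gamma) = 0 and delta U = 1
# (U a matrix of ones, 1 a row vector of ones), delta solves
# delta (I - Gamma + U) = 1, which folds normalisation into one system.
m = Gamma.shape[0]
delta = np.ones(m) @ np.linalg.inv(np.eye(m) - Gamma + np.ones((m, m)))

print(delta)          # strictly positive entries summing to 1
print(delta @ Gamma)  # equal to delta, confirming stationarity
```

Because the chain is irreducible (every state is reachable from every other), this solution is unique and every entry of `delta` is strictly positive, in line with the result quoted above.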