ABSTRACT

An architecture for modeling temporal sequences of input-output data is presented. The architecture, called Markov Mixture of Experts (MME), combines a set of static models (called experts) by means of a Markov chain. A distinct expert is assigned to each state of the Markov chain, and each output is produced by the expert corresponding to the current state. The transitions between states, which correspond to switching between the various experts, depend probabilistically on both the current state and the input variables. The architecture is an extension of both the Mixture of Experts architecture and Hidden Markov Models.
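The generative process described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's specification: the linear experts, the softmax parameterization of the input-conditioned transitions, and all names (`W_experts`, `W_trans`, `mme_generate`) are assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, d_in, d_out = 3, 4, 2

# Hypothetical parameters: one linear expert per Markov state, and
# input-dependent transition logits W_trans[s] for leaving state s.
W_experts = rng.normal(size=(n_states, d_in, d_out))
W_trans = rng.normal(size=(n_states, d_in, n_states))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mme_generate(xs, state=0):
    """At each time step: sample the next state from transition
    probabilities that depend on the current state AND the input,
    then emit the output of that state's expert."""
    outputs = []
    for x in xs:
        probs = softmax(x @ W_trans[state])   # input-conditioned transitions
        state = rng.choice(n_states, p=probs)  # switch expert
        outputs.append(x @ W_experts[state])   # expert for current state emits
    return np.array(outputs)

xs = rng.normal(size=(5, d_in))
ys = mme_generate(xs)  # one output vector per input in the sequence
```

The key contrast with a standard Mixture of Experts is that the gating distribution here depends on the previous state as well as the input, and with a standard HMM that the transitions are modulated by the input rather than fixed.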