ABSTRACT

The previous chapter introduced random number generation and presented some simple strategies for producing musical results that are more interesting than directly mapping distributions onto sound parameters. Without such controls, output taken directly from probability distributions sounds incoherent because successive samples are uncorrelated. In a random walk, by contrast, there is a great deal of correlation between successive events, and the results can sound almost too coherent! A Markov process, or Markov chain, is a type of random process closely related to the random walk that can produce different degrees of correlation between its random outcomes. In a Markov process, past events represent a state, or context, for determining the probabilities of subsequent events. The number of past events considered by the process is called its order. In a first-order process the probabilities for choosing the next event depend only on the immediately preceding event. In a second-order Markov process the probabilities for the next choice depend on the last two events. A Markov process can reflect any number of past choices, including the degenerate case of no past choices. A zero-order Markov process is equivalent to weighted random selection.
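
A first-order process can be sketched as a transition table that maps each state to a weighted set of possible next states. The Python fragment below is a minimal illustration of this idea; the pitch names and weights are hypothetical, chosen only to show the mechanism.

    import random

    # Hypothetical first-order transition table: each pitch maps to
    # weighted choices for the next pitch (weights are illustrative).
    TRANSITIONS = {
        "C": [("D", 0.5), ("E", 0.3), ("G", 0.2)],
        "D": [("C", 0.4), ("E", 0.6)],
        "E": [("C", 0.3), ("D", 0.3), ("G", 0.4)],
        "G": [("C", 0.7), ("E", 0.3)],
    }

    def next_state(current):
        # The next choice depends only on the immediately preceding event.
        choices, weights = zip(*TRANSITIONS[current])
        return random.choices(choices, weights=weights, k=1)[0]

    def markov_walk(start, length):
        # Generate a sequence of events from the first-order chain.
        state = start
        sequence = [state]
        for _ in range(length - 1):
            state = next_state(state)
            sequence.append(state)
        return sequence

    print(markov_walk("C", 12))

A higher-order process would use the same scheme but key the table on the last two (or more) events, while a zero-order table would reduce to a single weighted list, i.e. weighted random selection.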