ABSTRACT

The last four chapters all utilized coalescence of Markov chains in order to achieve perfect samples. With these techniques, an algorithm could be either read-once or interruptible, but not both. The techniques of this and the next two chapters rely on different ideas that move beyond Markov chains. This enables the construction of linear-time, read-once, and interruptible algorithms for sampling from high-noise Markov random fields, along with algorithms for distributions on permutations where coalescence is difficult to achieve.