ABSTRACT

This chapter presents three probabilistic graphical modeling paradigms in increasing order of generality, together with the associated Bayesian inference techniques for computing posterior probabilities given evidence. A graphical model has nodes representing the variables of a domain and arcs between nodes representing probabilistic relationships among those variables. A graphical model can be built in consultation with subject-matter experts. We start with Naïve Bayesian Classifiers (NBCs), move to their generalization, k-dependence Naïve Bayesian Classifiers (kNBCs), and finally explore the most general paradigm, Bayesian Belief Networks (BNs). An NBC is a kNBC with k = 0, and a kNBC is in turn a BN with limited relationships among variables. The structures of NBCs and kNBCs are considerably simpler than those of BNs, and inference in them does not require complex evidence-propagation algorithms. Given their foundations in Bayesian probability, these graphical models are perhaps the most suitable stepping stone from traditional statistical analytics to the model-based AI paradigm for analytics, since they allow human expertise to be incorporated easily into the models. Unless otherwise stated, variables are assumed to be categorical throughout this chapter.