ABSTRACT

This chapter introduces various entropy and α-entropy based similarity measures, such as the Rényi entropy and divergence, mutual information, and the α-Jensen difference divergence. It describes continuous Euclidean functionals, such as the minimal spanning tree and the k-nearest neighbor graph, whose lengths asymptotically converge to the Rényi entropy. The chapter presents entropic graph estimates of the Henze–Penrose affinity, the α-GA divergence, and the α-MI. It also describes the feature-based matching techniques used in this work, the different types of features used, and the advantages of such methods. The chapter discusses the computational considerations involved in constructing these graphs, and it presents the experiments we conducted to compare and contrast our methods with other registration algorithms. The chapter reviews entropy, relative entropy, and divergence as measures of dissimilarity between probability distributions, and it provides insight into the formulation of these algorithms and the assumptions that lead to faster, lower-complexity variants.
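As an illustrative sketch (not taken from the chapter), the following Python code shows one common form of such an entropic graph estimate: the γ-weighted length of the Euclidean minimal spanning tree over an i.i.d. sample, normalized by n^α with α = (d − γ)/d, gives a plug-in estimate of the Rényi α-entropy. The function name renyi_entropy_mst and the constant beta are assumptions made here for illustration; beta stands in for the normalizing bias constant, which in practice is obtained by simulation or from published approximations.

    # Minimal sketch, assuming an i.i.d. sample in R^d and a placeholder
    # normalizing constant beta; not the chapter's implementation.
    import numpy as np
    from scipy.spatial import distance_matrix
    from scipy.sparse.csgraph import minimum_spanning_tree

    def renyi_entropy_mst(points, gamma=1.0, beta=1.0):
        """Estimate the Renyi alpha-entropy of the density generating
        `points` (an (n, d) array) from the gamma-weighted MST length,
        with alpha = (d - gamma) / d."""
        n, d = points.shape
        alpha = (d - gamma) / d
        # Pairwise Euclidean distances raised to the edge-weight exponent gamma.
        weights = distance_matrix(points, points) ** gamma
        # Total gamma-weighted length of the Euclidean minimal spanning tree.
        mst_length = minimum_spanning_tree(weights).sum()
        # Plug-in estimate: H_alpha ~ (1/(1-alpha)) * [log(L / n^alpha) - log(beta)].
        return (np.log(mst_length / n**alpha) - np.log(beta)) / (1.0 - alpha)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sample = rng.standard_normal((500, 2))  # 500 points in the plane
        print(renyi_entropy_mst(sample, gamma=1.0))

A k-nearest neighbor graph length functional can be substituted for the MST length in the same formula; the choice trades off computational cost against variance of the estimate.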