ABSTRACT

By considering inductive inference from the viewpoint of a gradual inclusion of information, rather than as the forecasting of a given sequence, it is shown that conditional algorithmic complexity decreases during learning. Based on a theorem of Levin, conditional algorithmic complexity and mutual algorithmic complexity are shown to be approximated by conditional entropy and mutual information, respectively. Furthermore, physical randomness and physical complexity are shown to be given by conditional algorithmic complexity and mutual algorithmic complexity, and hence to sum to the algorithmic complexity. A relation between computation and measurement is suggested.
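As a brief illustration of the kind of approximation invoked (a sketch in standard notation, which is an assumption here: $K$ denotes prefix algorithmic complexity, $H(P)$ the Shannon entropy of a computable distribution $P$, and $K(P)$ the complexity of $P$; the paper's own definitions govern), the Levin-style bound for the unconditional case reads

\[
0 \;\le\; \sum_{x} P(x)\,K(x) \;-\; H(P) \;\le\; K(P) + O(1),
\]

with analogous bounds, as claimed above, relating conditional algorithmic complexity to conditional entropy and mutual algorithmic complexity to mutual information.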