ABSTRACT

This chapter argues, first, that the main argument for Information Ethics relies on a questionable notion of impartiality, and second, that the notion of an informational object is a rather elusive one. There are innumerable ways of describing reality in information-theoretic terms, no one of which is better than the others; which one we adopt is largely a matter of convention. The chapter also considers the argument that the history of ethics involves a steady movement toward greater impartiality. It then examines two candidate definitions of entropy: the first drawn from Claude Shannon's theory of statistical information, the second based on the theory of computational complexity. In each case, the chapter argues that while the given theory of information is sound, its application to ethics is dubious. The discussion is of necessity somewhat technical; the chapter closes by revisiting some of the same ideas in a less technical setting.