ABSTRACT

Communication is a way of transporting (or reproducing) information from point A, where it originates, to point B. The two points (A and B) can be separated in space or in time. Several authors have asked: what is the nature of information? It is related to the concepts of entropy, order and improbability. Claude Shannon proposed encoding a message as a series of binary questions, so that the information it contains can be measured in a dimensionless unit called the binary digit (bit). A "message" consisting of letters of the alphabet can be coded into "symbols" (Morse code), which can be transmitted through "signals" consisting of short electrical pulses (dots) or long electrical pulses (dashes). Information therefore relies on symbolic systems in which "something represents something else". The best-known symbolic systems are DNA and articulate language. Both consist of physical, replicable (transmissible) structures that originated in a historical process, which accounts for their arbitrariness. Organisms, and machines that simulate organisms, are systems with goals (goals situated in the future). They consist of three major components: (a) an input, which encodes some aspects of the physical world into information; (b) a processing center; and (c) an output equipped with actuators. To these three components must be added a further subsystem that feeds data about the performance actually achieved back into the system (negative feedback). In recent decades, exploration machines, machines with implicit intelligence (machine learning, deep learning) and machines with internal imaginative states have been developed. At both the computational and the philosophical level, a lively debate is underway about the possible negative effects that a non-beneficial artificial superintelligence could have on humanity and on life in general.
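
To illustrate the point about measuring information in bits, the following minimal Python sketch computes the average information per symbol of a message using Shannon's standard entropy formula, which follows from the "binary questions" framing. The example messages and function name are hypothetical and not part of the original text; this is an illustrative sketch, not the author's own method.

```python
from collections import Counter
from math import log2

def shannon_entropy_bits(message: str) -> float:
    """Average information per symbol, in bits (a dimensionless measure)."""
    counts = Counter(message)
    total = len(message)
    # H = -sum(p * log2(p)) over the observed symbol frequencies
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Two equally likely symbols carry exactly 1 bit per symbol,
# i.e. one yes/no (binary) question resolves each symbol.
print(shannon_entropy_bits("HTHTHTHT"))  # 1.0

# A highly predictable (low-improbability) message carries less information.
print(shannon_entropy_bits("AAAAAAAB"))  # ~0.544 bits per symbol
```

The second example shows the link between improbability and information mentioned in the abstract: the more predictable the message, the fewer bits per symbol it conveys.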