ABSTRACT

Whenever we deal with Complex Systems, we need to collect, handle, and process Big Data. To meet the challenges of complexity, we must therefore devise smart methods to cope with the fast streams of data we store, as well as with their variety, variability, and complexity. There are two main strategies for succeeding. One consists of improving current electronic computers; the other is the interdisciplinary research line of Natural Computing. Electronic computers are based on the von Neumann architecture, and, so far, the pace of their improvement has been described by Moore’s law. Nowadays, computer companies are devising computing technologies that try to go beyond Moore’s law. On the other hand, researchers working on Natural Computing draw inspiration from nature to propose new algorithms; new materials and architectures for computing and storing information; and new methodologies and models for interpreting Complex Systems. Natural Computing rests on the rationale that every natural transformation is a kind of computation, because information can be encoded through the states of natural systems. Natural processes can be grouped into two sets: those that involve living beings and are information-driven, and those that involve only inanimate matter and are driven by force fields.