The life and works of Claude Shannon are reviewed from a broader perspective than usual. In particular, the paper explores the idea that, in the long term, Shannon's major significance will lie not in the aspects most commonly praised, but in the fact that he was a pioneer on the way to a new conception of Science. The paper focuses on his mathematical theory of communication and the criticisms it has received from different fields. From this new framework of analysis, its greatest contribution becomes the introduction of information-theoretic entropy as a concept belonging to probability theory. The customary practice of referring to Shannon's theory as "the theory of transmission of information" or as "information theory" is argued to be inadequate; it is proposed instead that the term "information theory" be reserved for a theory of knowledge. More specifically, the true information revolution is identified as the formulation of Science as a probabilistic logical-inference theory founded on semantics, using plausible-logic languages together with observations.
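The information-theoretic entropy mentioned above is a purely probabilistic quantity, H = -Σ pᵢ log₂ pᵢ, defined for any discrete distribution independently of any communication setting. As a minimal illustration (the function name is ours, not Shannon's), it can be computed as:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, for a
    discrete probability distribution given as a list of probabilities.
    Terms with p = 0 contribute nothing, by the convention 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty;
# a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
print(shannon_entropy([1.0]))       # → 0.0
```

Nothing in the definition refers to messages or channels, which is precisely why the paper treats entropy as a contribution to probability theory rather than to a "theory of information".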