This revised edition offers an approach to information theory that is
more general than the classical approach of Shannon. Classically,
information is defined for an alphabet of symbols or for a set of
mutually exclusive propositions (a partition of the probability space Ω)
with corresponding probabilities adding up to 1. The new definition is
given for an arbitrary cover of Ω, i.e., for a set of possibly
overlapping propositions. The generalized information concept is called
novelty, and it is accompanied by two concepts derived from it,
designated as information and surprise, which describe "opposite"
versions of novelty: information is related more closely to classical
information theory, while surprise is related more closely to the
classical concept of statistical significance. In the discussion of
these three concepts and their interrelations, several properties or
classes of covers are defined, which turn out to be lattices. The book
also
presents applications of these concepts, mostly in statistics and in
neuroscience.
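
To make the contrast concrete, the following minimal Python sketch
compares the classical Shannon information -log2 P(A) of the partition
cell containing each outcome with a novelty-like quantity on an
overlapping cover. The sample space, the particular cover, and the
max-over-covering-propositions formula used here are illustrative
assumptions, not the book's exact definitions, which are developed in
the text itself.

    # Illustrative sketch only: contrasts information on a partition
    # (mutually exclusive propositions) with a novelty-like quantity on
    # a cover (possibly overlapping propositions). The definitions below
    # are assumptions for illustration, not the book's own formulas.
    from math import log2

    # A finite probability space Omega with equiprobable outcomes.
    omega = ["a", "b", "c", "d"]
    p = {w: 0.25 for w in omega}

    def prob(event):
        """Probability of an event (a set of outcomes)."""
        return sum(p[w] for w in event)

    # Classical case: a partition of Omega, with probabilities
    # adding up to 1.
    partition = [{"a", "b"}, {"c", "d"}]
    assert abs(sum(prob(A) for A in partition) - 1.0) < 1e-12

    def shannon_information(w, parts):
        # Information of the unique partition cell containing w.
        A = next(A for A in parts if w in A)
        return -log2(prob(A))

    # Generalized case: a cover of Omega, i.e., propositions that
    # may overlap.
    cover = [{"a", "b"}, {"b", "c"}, {"c", "d"}, {"d"}]

    def novelty(w, cov):
        # Assumed illustration: the largest information value
        # -log2 P(A) over all propositions A in the cover containing w.
        return max(-log2(prob(A)) for A in cov if w in A)

    for w in omega:
        print(w, shannon_information(w, partition), novelty(w, cover))

In this toy example the outcome "d" lies in both {"c", "d"} and the
smaller set {"d"}, so the cover-based quantity exceeds the partition-
based one, illustrating how overlapping propositions can carry more
novelty than any single partition allows.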