This is the first comprehensive book on information geometry, written by
the founder of the field. It begins with an elementary introduction to
dualistic geometry and proceeds to a wide range of applications,
covering information science, engineering, and neuroscience. It consists
of four parts, which can largely be read independently. A manifold
with a divergence function is first introduced, leading directly to
the dualistic structure, the heart of information geometry. This part (Part I) can be understood without any knowledge of differential geometry. An intuitive explanation of modern differential geometry then follows in Part II, although most of the book can be followed without it. Information geometry of statistical
inference, including time series analysis and semiparametric estimation
(the Neyman-Scott problem), is presented concisely in Part III.
Applications addressed in Part IV include current topics in machine
learning, signal processing, optimization, and neural networks. The book
is interdisciplinary, connecting mathematics, information science, physics, and neuroscience, inviting readers to a new world of
information and geometry. This book is highly recommended for graduate students and researchers who seek new mathematical methods and tools useful in their own fields.