An important text that offers an in-depth guide to how information
theory sets the boundaries for data communication
In an accessible and practical style, Information and Communication
Theory explores the topic of information theory and includes concrete
tools that are appropriate for real-life communication systems. The text
connects theory with practical applications across a wide variety of
topics, including an introduction to the basics of probability theory,
information, (lossless) source coding, typical sequences as a central
concept, channel coding, continuous random variables, Gaussian channels,
discrete-input continuous channels, and a brief look at rate distortion
theory.
The author explains the fundamental theory together with typical
compression algorithms and how they are used in practice. He moves on to
review source coding and how much a source can be compressed (a minimal
sketch of this idea follows the feature list below), and explains
algorithms such as the LZ family, with applications including ZIP and
PNG. In addition to exploring the channel coding theorem, the book
includes illustrative examples of codes. This comprehensive text:
- Provides an adaptive version of Huffman coding that estimates the
source distribution
- Contains a series of problems that reinforce understanding of the
material presented in the text
- Covers a variety of topics including optimal source coding, channel
coding, modulation and much more
- Includes appendices that explore probability distributions and the
sampling theorem
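As a minimal illustration of the source coding ideas mentioned above,
the following Python sketch (an illustrative assumption, not taken from
the book) builds a static Huffman code for a hypothetical four-symbol
source and compares its average codeword length with the source
entropy, the lower bound for lossless compression:

```python
# Minimal sketch (not from the book): static Huffman coding for a
# hypothetical four-symbol source, compared with the entropy bound.
import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code; probs maps symbol -> probability."""
    # Heap entries: (probability, unique tie-breaker, {symbol: codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # hypothetical source
code = huffman_code(source)
avg_len = sum(source[s] * len(w) for s, w in code.items())
entropy = -sum(p * log2(p) for p in source.values())
print(code)              # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len, entropy)  # 1.75 1.75 -- the code meets the entropy bound here
```

For this dyadic distribution the Huffman code reaches the entropy bound
of 1.75 bits per symbol exactly; in general the average codeword length
of an optimal code lies between the entropy and the entropy plus one bit.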
Written for undergraduate, master's, and graduate students studying
information theory, as well as professional engineers, Information and
Communication Theory offers an introduction to how information theory
sets the boundaries for data communication.