Concentration inequalities for functions of independent random variables
form an area of probability theory that has witnessed a great revolution
in the last few decades, and has applications in a wide variety of areas
such as machine learning, statistics, discrete mathematics, and
high-dimensional geometry. Roughly speaking, if a function of many
independent random variables does not depend too much on any of the
variables, then it is concentrated in the sense that, with high
probability, it is close to its expected value. This book offers a host
of inequalities that illustrate this rich theory in an accessible way,
covering the key developments and applications in the field.
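As a concrete illustration of this principle (a standard result stated here as a sketch, not quoted from the book itself), the bounded differences inequality asserts that if $X_1, \dots, X_n$ are independent and the function $f$ changes by at most $c_i$ when its $i$-th argument is varied, that is, $|f(x_1,\dots,x_i,\dots,x_n) - f(x_1,\dots,x_i',\dots,x_n)| \le c_i$ for all arguments, then
\[
\mathbb{P}\bigl( \lvert f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \rvert \ge t \bigr)
\le 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right)
\qquad \text{for all } t > 0,
\]
so the probability that $f$ deviates from its mean decays exponentially fast in $t^2$ once no single variable has too much influence.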
The authors describe the interplay between the probabilistic structure
(independence) and a variety of tools ranging from functional
inequalities to transportation arguments to information theory.
Applications to the study of empirical processes, random projections,
random matrix theory, and threshold phenomena are also presented.
A self-contained introduction to concentration inequalities, it includes
a survey of concentration of sums of independent random variables,
variance bounds, the entropy method, and the transportation method. Deep
connections with isoperimetric problems are revealed, whilst special
attention is paid to applications to suprema of empirical
processes.
Written by leading experts in the field and containing extensive
exercise sections, this book will be an invaluable resource for
researchers and graduate students in mathematics, theoretical computer
science, and engineering.