This book contains the lecture notes for a DMV course presented by the
authors at Günzburg, Germany, in September 1990. In the course we
sketched the theory of information bounds for nonparametric and
semiparametric models, and developed the theory of nonparametric
maximum likelihood estimation in several particular inverse problems:
interval censoring and deconvolution models. Part I, based on Jon
Wellner's lectures, gives a brief sketch of information lower bound
theory: Hájek's convolution theorem and extensions, useful minimax
bounds for parametric problems due to Ibragimov and Has'minskii, and a
recent result characterizing differentiable functionals due to van der
Vaart (1991). The differentiability theorem is illustrated with the
examples of interval censoring and deconvolution (which are pursued from
the estimation perspective in part II). The differentiability theorem
gives a way of clearly distinguishing situations in which the parameter
of interest can be estimated at rate n^{1/2} and situations in which
this is not the case. However, it says nothing about which rates to
expect when the functional is not differentiable. Even the casual reader
will notice that several models are introduced, but not pursued in any
detail; many problems remain. Part II, based on Piet Groeneboom's
lectures, focuses on nonparametric maximum likelihood estimators
(NPMLEs) for certain inverse problems. The first chapter deals with the
interval censoring problem.