About This Book

This book is about training methods - in particular,
fast second-order training methods - for multi-layer perceptrons (MLPs).
MLPs (also known as feed-forward neural networks) are the most
widely-used class of neural network. Over the past decade MLPs have
achieved increasing popularity among scientists, engineers and other
professionals as tools for tackling a wide variety of information
processing tasks. In common with all neural networks, MLPs are trained
(rather than programmed) to carry out the chosen information processing
function. Unfortunately, the 'traditional' method for training MLPs - the
well-known backpropagation method - is notoriously slow and unreliable
when applied to many practical tasks. The development of fast and
reliable training algorithms for MLPs is one of the most important areas
of research within the entire field of neural computing.

The main purpose
of this book is to bring to a wider audience a range of alternative
methods for training MLPs, methods which have proved orders of magnitude
faster than backpropagation when applied to many training tasks. The
book also addresses the well-known 'local minima' problem, and explains
ways in which fast training methods can be combined with strategies
for avoiding (or escaping from) local minima. All the methods described
in this book have a strong theoretical foundation, drawing on such
diverse mathematical fields as classical optimisation theory, homotopic
theory and stochastic approximation theory.