R. S. GOVINDARAJU and A. RAMACHANDRA RAO
School of Civil Engineering, Purdue University, West Lafayette, IN, USA

Background and Motivation

The
basic notion of artificial neural networks (ANNs), as we understand them
today, was perhaps first formalized by McCulloch and Pitts (1943) in
their model of an artificial neuron. Research in this field remained
somewhat dormant in the early years, perhaps because of the limited
capabilities of this method and because there was no clear indication of
its potential uses. However, interest in this area picked up momentum in
a dramatic fashion with the works of Hopfield (1982) and Rumelhart et
al. (1986). Not only did these studies place artificial neural networks
on a firmer mathematical footing, but they also opened the door to a host of
potential applications for this computational tool. Consequently, neural
network computing has progressed rapidly along all fronts: theoretical
development of different learning algorithms, computing capabilities,
and applications to diverse areas from neurophysiology to the stock
market.

Initial studies on artificial neural networks were prompted by a desire
to have computers mimic human learning. As a result, the jargon
associated with the technical literature on this subject is replete with
expressions such as excitation and inhibition of neurons, strength of
synaptic connections, learning rates, training, and network experience.
ANNs have also been referred to as neurocomputers by people who want to
preserve this analogy.