Humanity's most basic intellectual quest, to decipher nature and master it, has led to numerous efforts to build machines that simulate the world or communicate with it [Bus70, Tur36, MP43, Sha48, vN56, Sha41, Rub89, NK91, Nyc92]. The computational power and dynamic behavior of such machines are central questions for mathematicians, computer scientists, and, occasionally, physicists. Our interest is in computers
called artificial neural networks. In their most general framework,
neural networks consist of assemblies of simple processors, or
"neurons," each of which computes a scalar activation function of its
input. This activation function is nonlinear and typically monotonic with bounded range, much like neural responses to input stimuli. The scalar value produced by a neuron affects other neurons, which then compute new scalar values of their own; iterating this exchange across all neurons defines the network's dynamical behavior under parallel updates. Some of the
signals originate from outside the network and act as inputs to the
system, while other signals are communicated back to the environment and
are thus used to encode the end result of the computation.
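As a concrete, merely illustrative formalization of this description (the symbols and the affine form of the argument are assumptions made here, not a definition taken from the text), consider a network of $N$ neurons receiving $M$ external input signals; under a synchronous, parallel update its dynamics may be written as
\[
  x_i(t+1) \;=\; \sigma\!\Bigl(\,\sum_{j=1}^{N} a_{ij}\, x_j(t) \;+\; \sum_{j=1}^{M} b_{ij}\, u_j(t) \;+\; c_i\Bigr), \qquad i = 1,\dots,N,
\]
where $\sigma$ is the bounded, monotone activation function, $x_j(t)$ are the neurons' scalar activations, $u_j(t)$ the external input signals, and $a_{ij}$, $b_{ij}$, $c_i$ real-valued parameters; the end result of the computation is read off a designated subset of the $x_i$.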