Modern microprocessors make use of
speculation, or predictions about future program behavior, to optimize
the execution of programs. Perceptrons are simple neural networks that are
well suited to speculation: they can examine a larger quantity of the
available data than more commonly used approaches and identify which of
those data lead to accurate predictions. This work first studies
how perceptrons can be made to predict accurately when they directly
replace the traditional pattern table predictor. Different training
methods, perceptron topologies, and interference reduction strategies
are evaluated. Perceptrons are then applied to two speculative
applications: data value prediction and dataflow critical path
prediction. For each application, several novel perceptron-based prediction
strategies are proposed that can draw on a wider scope of past data when
making predictions than previous predictors could. These
predictors are evaluated against local table-based approaches on a
custom cycle-accurate processor simulator, and are shown on average to
have both superior accuracy and higher instructions-per-cycle (IPC)
performance. This work is addressed to computer architects and computer
engineering researchers.
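
The abstract itself does not detail the predictor's internals. As a point of
reference, a minimal sketch of the kind of global-history perceptron predictor
it alludes to, in which a table of signed weights is indexed by the branch
address, the prediction is the sign of the dot product of those weights with
the outcome history, and training is gated by a confidence threshold, might
look as follows; the table size, history length, weight width, and threshold
constants are illustrative assumptions rather than parameters of this work.

```cpp
// Sketch of a global-history perceptron branch predictor.
// All sizing constants below are assumed for illustration only.
#include <array>
#include <cstddef>
#include <cstdint>
#include <cstdlib>

class PerceptronPredictor {
public:
    // Predict the branch at `pc`: true = taken, false = not taken.
    bool predict(uint64_t pc) {
        const Perceptron& p = table_[index(pc)];
        int y = p.bias;
        for (int i = 0; i < kHistoryLen; ++i) {
            // History bit i contributes +w_i if that branch was taken, -w_i otherwise.
            y += history_[i] ? p.weights[i] : -p.weights[i];
        }
        last_output_ = y;          // remembered for training (single outstanding branch)
        return y >= 0;
    }

    // Train on the resolved outcome of the most recently predicted branch.
    void update(uint64_t pc, bool taken) {
        Perceptron& p = table_[index(pc)];
        const bool predicted_taken = last_output_ >= 0;
        // Train only on a misprediction or when the output is below the
        // confidence threshold.
        if (predicted_taken != taken || std::abs(last_output_) <= kThreshold) {
            const int t = taken ? 1 : -1;
            p.bias = saturate(p.bias + t);
            for (int i = 0; i < kHistoryLen; ++i) {
                const int x = history_[i] ? 1 : -1;
                p.weights[i] = saturate(p.weights[i] + t * x);
            }
        }
        // Shift the actual outcome into the global history register.
        for (int i = kHistoryLen - 1; i > 0; --i) history_[i] = history_[i - 1];
        history_[0] = taken;
    }

private:
    static constexpr int kHistoryLen = 32;    // bits of global history (assumed)
    static constexpr int kTableSize  = 1024;  // number of perceptrons (assumed)
    static constexpr int kWeightMax  = 127;   // 8-bit signed weight range (assumed)
    static constexpr int kThreshold  = static_cast<int>(1.93 * kHistoryLen + 14);

    struct Perceptron {
        int bias = 0;
        std::array<int, kHistoryLen> weights{};
    };

    static int saturate(int w) {
        if (w > kWeightMax)  return kWeightMax;
        if (w < -kWeightMax) return -kWeightMax;
        return w;
    }

    std::size_t index(uint64_t pc) const { return (pc >> 2) % kTableSize; }

    std::array<Perceptron, kTableSize> table_{};
    std::array<bool, kHistoryLen> history_{};
    int last_output_ = 0;
};
```

In this formulation, each weight records how strongly one past outcome
correlates with the branch being predicted, which is what lets a perceptron
weigh a much longer history than a pattern table of comparable storage.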