Ever since the advent of the intelligence test, we have thought of
exceptional achievement in terms of cognitive attributes. We have words
and phrases like "genius," "above average intelligence," "average," and
"mentally deficient" to describe different levels of cognitive ability.
In the United States, widespread use of intelligence tests followed the
success of the Army Alpha and Beta Tests in World War I, and for the
next half-century intelligence tests were the major measures used to
predict school and vocational achievement. Learning was primarily
studied in laboratories, and the behaviorist theories that were dominant
largely dealt with changes in overt behavior. As a result, learning
research had relatively little influence on concepts of cognition and
intelligence. The transition from behaviorism to cognitive psychology
that began in the 1940s and 1950s came into full flower in the 1970s
and 1980s, and great progress was made in understanding
learning, memory, and thinking. In the decades following World War I
there had been many debates about the possible influence of
environmental conditions on intelligence, but the cognitive abilities
measured by intelligence tests were generally believed to be determined
by heredity. Scores on these tests correlated substantially with
academic performance, so their use in determining which students needed
special help in school and which were capable of university work was
widely accepted. As cognitive psychology became dominant, however, it
grew apparent that although heredity was important, intelligence
consisted of learnable abilities.