Search is a broad class of machine learning methods in which candidate solutions are generated and tested. We focus on evolving computable functions with genetic programming.
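To fix ideas, the generate-and-test loop behind such search methods can be sketched in a few lines: sample random candidate programs and keep the first that passes every test case. Everything below (the arithmetic primitives, the target f(x) = 3x + 1, the function names) is an illustrative assumption, not material from the work itself.

```python
import random

# Hypothetical primitive set and test cases, chosen only for illustration.
PRIMITIVES = ["+", "-", "*"]
CASES = [(x, 3 * x + 1) for x in range(5)]  # target function: f(x) = 3x + 1

def random_expr(depth=3):
    """Generate a random expression tree over x and small constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", "1", "2", "3"])
    op = random.choice(PRIMITIVES)
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x):
    """Interpret an expression tree on input x."""
    if expr == "x":
        return x
    if isinstance(expr, str):
        return int(expr)
    op, left, right = expr
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == "+" else a - b if op == "-" else a * b

def generate_and_test(trials=100_000):
    """Plain generate-and-test: sample programs until one passes every case."""
    for _ in range(trials):
        candidate = random_expr()
        if all(evaluate(candidate, x) == y for x, y in CASES):
            return candidate
    return None  # budget exhausted without a solution

print(generate_and_test())
```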
The literature reveals that the complexity of evolved programs is small, indicating a limitation of current methods. No Free Lunch is not valid for machine learning, because simpler functions are represented more frequently; this bias is also related to Occam's razor.
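The over-representation of simpler functions can be observed directly in a toy representation. The sketch below (our assumption: Boolean expression trees over AND, OR, NOT and two variables, not the work's own setting) enumerates every tree up to a small size and counts how many trees compute each two-input Boolean function; the counts are heavily skewed toward simple functions such as projections and single conjunctions, while parity-like functions are rare.

```python
from collections import Counter
from functools import lru_cache
from itertools import product

INPUTS = list(product([False, True], repeat=2))  # the four (a, b) input pairs

def semantics(expr):
    """Truth table of an expression tree, usable as a dictionary key."""
    def ev(e, a, b):
        if e == "a":
            return a
        if e == "b":
            return b
        if e[0] == "not":
            return not ev(e[1], a, b)
        x, y = ev(e[1], a, b), ev(e[2], a, b)
        return (x and y) if e[0] == "and" else (x or y)
    return tuple(ev(expr, a, b) for a, b in INPUTS)

@lru_cache(maxsize=None)
def trees(size):
    """All expression trees with exactly `size` nodes."""
    if size == 1:
        return ("a", "b")
    out = [("not", t) for t in trees(size - 1)]
    for left in range(1, size - 1):  # split remaining nodes between children
        for l in trees(left):
            for r in trees(size - 1 - left):
                out.extend([("and", l, r), ("or", l, r)])
    return tuple(out)

# Count how many trees of up to 9 nodes compute each Boolean function.
counts = Counter(semantics(t) for s in range(1, 10) for t in trees(s))
for table, n in counts.most_common():
    print(table, n)
```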
We argue for Occam's razor, not on grounds of simplicity but of probability. The complexity of a function depends on the primitives available. If, however, the representation can build new primitives, then the complexity of a function is independent of the choice of primitives, up to an additive constant. We give bounds on these constants (one for each pair of primitive sets) and argue that they are the tightest possible.
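Stated in standard Kolmogorov-complexity form (our notation, given for orientation rather than as the work's own):

```latex
% Invariance of complexity under a change of primitives (standard
% Kolmogorov-style statement; C_A(f) denotes the size of the smallest
% program computing f over primitive set A).
\[
  \lvert C_A(f) - C_B(f) \rvert \;\le\; c_{A,B}
  \qquad \text{for every computable } f,
\]
% The constant c_{A,B} depends on the two primitive sets but not on f:
% each primitive of B is defined once as a module over A, after which
% any program over B translates at constant extra cost.
```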
We examine representation, genetic operators, and fitness functions.
A representation that addresses the general case of a problem is fruitful, since large instances can then be solved by evolving solutions to small instances. Different versions of a fitness function are compared which take into account whether a program halted or had to be terminated.
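A plausible shape for such a fitness function, sketched under assumed details (the toy register machine, the step budget, and the penalty scheme are illustrative choices, not the work's definitions):

```python
STEP_LIMIT = 200  # assumed resource budget per run

def run_program(program, x, limit=STEP_LIMIT):
    """Run a toy register machine: `program` is a list of instructions
    ("inc",), ("dec",), ("jnz", target), ("halt",) over one accumulator
    starting at x. Returns (accumulator, halted), where halted is False
    when the step budget expired and the run had to be terminated."""
    acc, pc, steps = x, 0, 0
    while pc < len(program):
        if steps >= limit:
            return acc, False  # forced termination
        op = program[pc]
        if op[0] == "inc":
            acc += 1
        elif op[0] == "dec":
            acc -= 1
        elif op[0] == "jnz" and acc != 0:
            pc, steps = op[1], steps + 1
            continue
        elif op[0] == "halt":
            return acc, True
        pc += 1
        steps += 1
    return acc, True  # fell off the end: halted normally

def fitness(program, cases, termination_penalty=0.5):
    """One variant of the compared fitness functions: a correct halting
    run scores 1, a run that had to be terminated is penalised, and a
    wrong but halting answer scores 0. The weighting is an assumption."""
    total = 0.0
    for x, expected in cases:
        output, halted = run_program(program, x)
        if not halted:
            total -= termination_penalty
        elif output == expected:
            total += 1.0
    return total / len(cases)

# Example: a looping program that counts the accumulator down to zero.
countdown = [("jnz", 2), ("halt",), ("dec",), ("jnz", 2), ("halt",)]
print(fitness(countdown, [(n, 0) for n in range(5)]))  # -> 1.0
```

Comparing variants then amounts to changing how terminated runs are weighted relative to correct halting ones.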
A crossover operator that acts on modules is introduced; it increases the probability of generating correctly terminating programs.
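A module-level crossover might look roughly as follows; the (modules, main) program representation and all names are assumptions made for illustration. Swapping whole, self-contained modules rather than arbitrary subtrees keeps each offspring's building blocks internally consistent, which is why such an operator can raise the proportion of correctly terminating offspring.

```python
import random

def module_crossover(parent_a, parent_b, rng=random):
    """Crossover acting on modules rather than arbitrary subtrees.

    Each parent is assumed to be a dict:
        {"modules": {name: body, ...}, "main": body}
    The operator exchanges the bodies of one randomly chosen shared
    module, leaving both main programs and all other modules intact.
    """
    shared = sorted(set(parent_a["modules"]) & set(parent_b["modules"]))
    if not shared:
        return parent_a, parent_b  # nothing to exchange
    name = rng.choice(shared)
    child_a = {"main": parent_a["main"], "modules": dict(parent_a["modules"])}
    child_b = {"main": parent_b["main"], "modules": dict(parent_b["modules"])}
    child_a["modules"][name], child_b["modules"][name] = (
        parent_b["modules"][name],
        parent_a["modules"][name],
    )
    return child_a, child_b

# Tiny usage example with opaque module bodies.
p1 = {"main": "call loop", "modules": {"loop": "body-1", "step": "s-1"}}
p2 = {"main": "call loop twice", "modules": {"loop": "body-2", "step": "s-2"}}
c1, c2 = module_crossover(p1, p2)
print(c1["modules"], c2["modules"])
```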