Special tools are required for examining and solving optimization
problems. The main tools in the study of local optimization are
classical calculus and its modern generalizations, which form nonsmooth
analysis. The gradient and various kinds of generalized derivatives
allow us to accomplish a local approximation of a given function in a
neighbourhood of a given point. This kind of approximation is very
useful in the study of local extrema. However, local approximation alone
cannot help to solve many problems of global optimization, so there is a
clear need to develop special global tools for solving these problems.
The simplest and most well-known area of global and simultaneously local
optimization is convex programming. The fundamental tool in the study of
convex optimization problems is the subgradient, which actually plays
both a local and global role. First, a subgradient of a convex function
f at a point x carries out a local approximation of f in a
neighbourhood of x. Second, the subgradient permits the construction of an
affine function, which does not exceed f over the entire space and
coincides with f at x. This affine function h is called a support
function. Since f(y) ≥ h(y) for all y, the second role is global. In contrast
to a local approximation, the function h will be called a global affine
support.
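To make the two roles explicit, one may record them through the standard subgradient inequality; the symbol g below denotes a subgradient of f at x and is introduced here only for this illustrative sketch:
\[
  f(y) \;\ge\; f(x) + \langle g,\, y - x \rangle \quad \text{for all } y,
  \qquad
  h(y) \;=\; f(x) + \langle g,\, y - x \rangle .
\]
The affine function h defined by the second relation coincides with f at x and, by the first relation, nowhere exceeds f; thus it serves simultaneously as a local approximation of f near x and as a global affine support.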