A general convergence theorem
for the gradient method is proved under hypotheses given below. It is then
shown that the usual steepest descent and modified steepest descent algorithms
converge under the same hypotheses. The modified steepest descent algorithm allows
the stepsize to vary from iteration to iteration.
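To make the distinction concrete, the following is a minimal numerical sketch of steepest descent in which the stepsize is chosen anew at each iteration by a backtracking (Armijo) line search. This is an illustrative implementation under standard assumptions, not the specific algorithm or hypotheses of the paper; all names and parameter values are chosen for the example.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000,
                     alpha0=1.0, beta=0.5, c=1e-4):
    """Steepest descent with a backtracking (Armijo) line search,
    so the stepsize varies from iteration to iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Shrink alpha until the sufficient-decrease condition holds.
        alpha = alpha0
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= beta
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = steepest_descent(f, grad, np.zeros(2))
```

With a fixed stepsize one must know a bound on the curvature of `f` in advance; the backtracking rule instead adapts the stepsize to the local behavior of `f`, which is the practical motivation for allowing it to vary.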