Algorithms
Large-Scale Optimization
By default, fminunc chooses the large-scale algorithm if you supply the gradient in fun (and set the GradObj option to 'on' using optimset). This algorithm is a subspace trust-region method based on the interior-reflective Newton method described in [2] and [3]. Each iteration involves the approximate solution of a large linear system using the method of preconditioned conjugate gradients (PCG). See Large Scale fminunc Algorithm, Trust-Region Methods for Nonlinear Minimization, and Preconditioned Conjugate Gradient Method.
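As a minimal sketch of supplying the gradient in fun (the file name rosenbrockwithgrad, the test function, and the starting point are illustrative, not from this page): the objective returns the gradient as a second output, and nargout lets fminunc request it only when needed.

```matlab
% rosenbrockwithgrad.m -- illustrative objective returning value and gradient
function [f, g] = rosenbrockwithgrad(x)
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;      % Rosenbrock's function
if nargout > 1                                 % gradient requested by the solver
    g = [-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
end
```

With this file on the path, options = optimset('GradObj','on'); followed by [x, fval] = fminunc(@rosenbrockwithgrad, [-1; 2], options); would let fminunc select the large-scale algorithm, since the gradient is available.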
Medium-Scale Optimization
fminunc, with the LargeScale option set to 'off' using optimset, uses the BFGS quasi-Newton method with a cubic line search procedure. This quasi-Newton method uses the BFGS formula ([1], [5], [8], and [9]) to update the approximation of the Hessian matrix. You can select the DFP formula ([4], [6], and [7]), which approximates the inverse Hessian matrix, by setting the HessUpdate option to 'dfp' (and the LargeScale option to 'off'). You can select a steepest descent method by setting HessUpdate to 'steepdesc' (and LargeScale to 'off'), although this is not recommended.
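The option combinations above can be sketched as follows (the anonymous objective and starting point are illustrative assumptions, not from this page):

```matlab
% Force the medium-scale algorithm; BFGS is the default Hessian update.
options = optimset('LargeScale', 'off');
[x, fval] = fminunc(@(x) (x(1) - 3)^2 + (x(2) + 1)^2, [0; 0], options);

% Select the DFP inverse-Hessian update instead of BFGS.
options = optimset('LargeScale', 'off', 'HessUpdate', 'dfp');

% Steepest descent (not recommended, typically slow to converge).
options = optimset('LargeScale', 'off', 'HessUpdate', 'steepdesc');
```

Each optimset call returns an options structure that is passed as the last argument to fminunc; the HessUpdate setting only applies when LargeScale is 'off'.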