Review Questions for Optimization
-  What are the necessary and sufficient conditions for a point to be a local minimum in one dimension?
 
-  What are the necessary and sufficient conditions for a point to be a local minimum in \(n\) dimensions?
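
As a reference for the two questions above, the standard statements (assuming \(f\) is twice continuously differentiable) are:

\[
\begin{aligned}
\text{1D:}\quad & f'(x^*) = 0,\; f''(x^*) \ge 0 \ \text{(necessary)}; \qquad f'(x^*) = 0,\; f''(x^*) > 0 \ \text{(sufficient)}\\
n\text{D:}\quad & \nabla f(x^*) = 0,\; H_f(x^*) \text{ positive semi-definite (necessary)}; \qquad \nabla f(x^*) = 0,\; H_f(x^*) \text{ positive definite (sufficient)}
\end{aligned}
\]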
 
-  How do you classify extrema as minima, maxima, or saddle points?
 
-  What is the difference between a local and a global minimum?
 
-  What does it mean for a function to be unimodal?
 
-  What property must a function have for golden section search to be guaranteed to find a minimum?
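
For reference, one standard definition, and the property the previous two questions point at:

\[
f \ \text{is unimodal on} \ [a,b] \iff \exists\, x^* \in [a,b] \ \text{such that} \ f \ \text{strictly decreases on} \ [a, x^*] \ \text{and strictly increases on} \ [x^*, b].
\]

Unimodality is what lets golden section search discard one subinterval each iteration without ever discarding the minimizer.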
 
-  Run one iteration of golden section search.
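
One iteration can be sketched as follows; the test function and bracket are assumed for illustration, not taken from the sheet:

```python
import math

def golden_section_step(f, a, b):
    """One iteration of golden section search on the bracket [a, b].

    Assumes f is unimodal on [a, b]; returns a shorter bracket that
    still contains the minimizer.
    """
    tau = (math.sqrt(5) - 1) / 2          # inverse golden ratio, ~0.618
    x1 = b - tau * (b - a)                # left interior point
    x2 = a + tau * (b - a)                # right interior point
    if f(x1) <= f(x2):
        return a, x2                      # minimizer lies in [a, x2]
    return x1, b                          # minimizer lies in [x1, b]

# Assumed example: f(x) = (x - 1)^2 on [0, 3]
a, b = golden_section_step(lambda x: (x - 1) ** 2, 0.0, 3.0)
# The bracket shrinks by a factor of tau and still contains x = 1.
```

Note that each iteration shrinks the bracket by the same fixed factor \(\tau \approx 0.618\), which gives golden section search its linear convergence.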
 
-  Calculate the gradient of a function (a function with many inputs and one output).
 
-  Calculate the Jacobian of a function (a function with many inputs and many outputs).
 
-  Calculate the Hessian of a function.
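
A small worked example covering all three derivatives; the functions below are assumed for illustration:

```python
import numpy as np

# Assumed example functions (not from the sheet):
#   f(x, y) = x^2 * y        scalar output -> gradient and Hessian
#   g(x, y) = (x*y, x + y)   vector output -> Jacobian

def grad_f(x, y):
    # [df/dx, df/dy]
    return np.array([2 * x * y, x ** 2])

def hess_f(x, y):
    # matrix of second partials; symmetric since f is smooth
    return np.array([[2 * y, 2 * x],
                     [2 * x, 0.0]])

def jac_g(x, y):
    # row i holds the partial derivatives of the i-th output
    return np.array([[y, x],
                     [1.0, 1.0]])

g1 = grad_f(1.5, -2.0)   # [-6.0, 2.25]
H = hess_f(1.5, -2.0)    # [[-4.0, 3.0], [3.0, 0.0]]
J = jac_g(1.5, -2.0)     # [[-2.0, 1.5], [1.0, 1.0]]
```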
 
-  Find the search direction in steepest/gradient descent.
 
-  Why must you perform a line search at each step of gradient descent?
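
The search direction is the negative gradient, \(s = -\nabla f(x)\), which fixes a direction but not a step length; a line search chooses how far to move. A minimal sketch using a backtracking (Armijo) line search, with commonly used default parameters assumed for illustration:

```python
import numpy as np

def steepest_descent_step(f, grad, x, alpha=1.0, rho=0.5, c=1e-4):
    """One steepest-descent step with a backtracking (Armijo) line search."""
    s = -grad(x)                           # search direction: negative gradient
    fx, slope = f(x), np.dot(grad(x), s)   # slope < 0 along a descent direction
    while f(x + alpha * s) > fx + c * alpha * slope:
        alpha *= rho                       # step too long: shrink it
    return x + alpha * s

# Assumed example: f(x) = ||x||^2, whose gradient is 2x
x_new = steepest_descent_step(lambda x: x @ x,
                              lambda x: 2 * x,
                              np.array([2.0, 1.0]))
```

With no line search (always taking alpha = 1) this example would overshoot to \([-2, -1]\) and make no progress, which is exactly why the step size must be chosen each iteration.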
 
-  Run one step of Newton's method in one dimension.
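
One step in 1D is \(x_{k+1} = x_k - f'(x_k)/f''(x_k)\). A sketch on an assumed example:

```python
def newton_step_1d(df, d2f, x):
    """One step of Newton's method for 1D minimization:
    x_new = x - f'(x) / f''(x)."""
    return x - df(x) / d2f(x)

# Assumed example: f(x) = x^4, so f'(x) = 4x^3 and f''(x) = 12x^2.
# Starting from x = 1, one step gives 1 - 4/12 = 2/3.
x1 = newton_step_1d(lambda x: 4 * x ** 3, lambda x: 12 * x ** 2, 1.0)
```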
 
-  Run one step of Newton's method in \(n\) dimensions.
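
In \(n\) dimensions one step solves the linear system \(H_f(x_k)\, s = -\nabla f(x_k)\) for the step \(s\), then sets \(x_{k+1} = x_k + s\). A sketch on an assumed quadratic, which Newton's method minimizes in a single step:

```python
import numpy as np

def newton_step(grad, hess, x):
    """One step of Newton's method in n dimensions:
    solve H(x) s = -grad(x), rather than inverting H explicitly."""
    s = np.linalg.solve(hess(x), -grad(x))
    return x + s

# Assumed example: f(x) = x1^2 + 5*x2^2, minimized at the origin.
grad = lambda x: np.array([2 * x[0], 10 * x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 10.0]])

x1 = newton_step(grad, hess, np.array([1.0, 1.0]))
# A quadratic is minimized in one Newton step: x1 is the origin.
```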
 
-  When does Newton's method fail to converge to a minimum?
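
One failure mode worth having a concrete picture of: Newton's method only seeks stationary points, so where \(f'' < 0\) it heads toward a maximum. A sketch on an assumed example:

```python
import math

# Assumed example: f(x) = cos(x), which has a MAXIMUM at x = 0.
# Started nearby, the iteration x <- x - f'(x)/f''(x) converges to it.
df  = lambda x: -math.sin(x)   # f'(x)
d2f = lambda x: -math.cos(x)   # f''(x), negative near x = 0
x = 0.3
for _ in range(5):
    x -= df(x) / d2f(x)
# x is now essentially 0, where f''(0) = -1 < 0: a maximum, not a minimum.
```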
 
-  What operations do you need to perform at each iteration of golden section search?
 
-  What operations do you need to perform at each iteration of Newton's method in one dimension?
 
-  What operations do you need to perform at each iteration of Newton's method in \(n\) dimensions?
 
-  What is the convergence rate of Newton's method?
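
Near a minimum with \(f''(x^*) \ne 0\), Newton's method converges quadratically: the error is roughly squared each iteration. A sketch demonstrating this on an assumed example, \(f(x) = e^x - 2x\) with minimizer \(x^* = \ln 2\):

```python
import math

# Assumed example: f(x) = exp(x) - 2x, so f'(x) = exp(x) - 2, f''(x) = exp(x).
df  = lambda x: math.exp(x) - 2.0
d2f = lambda x: math.exp(x)
xstar = math.log(2.0)

x, errors = 1.0, []
for _ in range(5):
    x -= df(x) / d2f(x)
    errors.append(abs(x - xstar))
# Each error is roughly the square of the previous one.
```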