## Quasi-Newton Methods in Optimization

#### Quasi-Newton Approximations

This exercise demonstrates quasi-Newton methods, Newton's method, and steepest descent for unconstrained optimization. The tutorial covers:

- Newton's method (exact 2nd derivatives)
- BFGS-Update method (approximate 2nd derivatives)
- Conjugate gradient method
- Steepest descent method

Chapter 3 covers these methods and their theoretical background. This exercise is a practical implementation of each method, with simplified example code for instructional purposes. The examples do not perform a line search, a topic covered in more detail later.
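For quick reference, the search directions used by the four methods can be sketched in their standard textbook forms (these formulas are a general summary, not the course's specific notation; here $g_k = \nabla f(x_k)$, $s_k = x_{k+1}-x_k$, $y_k = g_{k+1}-g_k$, and the Fletcher-Reeves variant is shown for the conjugate gradient $\beta_k$):

$$p_k = -\left[\nabla^2 f(x_k)\right]^{-1} g_k \quad \text{(Newton's method)}$$

$$p_k = -H_k\, g_k, \qquad H_{k+1} = \left(I-\rho_k s_k y_k^T\right) H_k \left(I-\rho_k y_k s_k^T\right) + \rho_k s_k s_k^T, \qquad \rho_k = \frac{1}{y_k^T s_k} \quad \text{(BFGS)}$$

$$p_k = -g_k + \beta_k\, p_{k-1}, \qquad \beta_k = \frac{g_k^T g_k}{g_{k-1}^T g_{k-1}} \quad \text{(conjugate gradient)}$$

$$p_k = -g_k \quad \text{(steepest descent)}$$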

#### MATLAB Source Code

#### Python Source Code
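As an illustrative sketch only (not the course's distributed source code), the four methods might be implemented in Python on a simple convex quadratic test function. The test problem, step sizes, and iteration counts below are assumptions chosen for the example; fixed step sizes stand in for the line search, except in conjugate gradient, where the exact minimizing step for a quadratic has a closed form:

```python
import numpy as np

# Illustrative test problem (an assumption, not from the course):
# f(x) = 0.5 x^T A x - b^T x with minimizer x* = A^{-1} b = (0.2, 0.4)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b          # exact gradient of the quadratic

def hess(x):
    return A                  # exact (constant) Hessian

def newton(x, iters=5):
    """Newton's method with exact second derivatives."""
    for _ in range(iters):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

def steepest_descent(x, alpha=0.1, iters=500):
    """Steepest descent; a fixed step size stands in for a line search."""
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

def bfgs(x, alpha=0.1, iters=200):
    """BFGS update of an approximate inverse Hessian H (no line search)."""
    n = len(x)
    H = np.eye(n)
    g = grad(x)
    for _ in range(iters):
        s = -alpha * (H @ g)               # fixed-length quasi-Newton step
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                     # skip update if curvature check fails
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

def conjugate_gradient(x, iters=50):
    """Linear CG; on a quadratic the exact step length has a closed form."""
    g = grad(x)
    p = -g
    for _ in range(iters):
        alpha = (g @ g) / (p @ (A @ p))    # exact minimizing step
        x = x + alpha * p
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta
        p = -g_new + beta * p
        g = g_new
        if np.linalg.norm(g) < 1e-12:      # converged (n steps for a quadratic)
            break
    return x

x0 = np.zeros(2)
for method in (newton, steepest_descent, bfgs, conjugate_gradient):
    print(method.__name__, method(x0))
```

All four converge to the same minimizer; the contrast to observe is iteration count, with Newton's method finishing in one step on a quadratic while steepest descent needs hundreds of fixed-step iterations.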

This assignment can be completed in groups of two. Additional guidelines on individual, collaborative, and group assignments are provided under the Expectations link.
