(Incomplete) Implementation of the FilterSQP constrained optimization algorithm using automatic differentiation

nlw0/filterSqp


filterSqp

Implementation of the FilterSQP constrained optimization algorithm, using the FADBAD++ library for automatic differentiation.

2015/Sep - This is still very much a work in progress, and there is nothing related to constrained optimization yet. I am implementing trust-region optimization first, adapting my old Python code from the Corisco project.

But we do have preliminary results. Here are some basic demos of minimizing classic test functions.

Himmelblau function optimization from different starting points, using a fixed-step-length trust-region method.

On the Rosenbrock "banana" function: gradient descent is smooth but slow, taking far too many iterations along the valley.

Newton's method: faster, but clumsy; a smoother track would be better.

Fixed-step trust region with the same starting point as the Newton's method run. Much smoother, following the valley.

Fixed-step trust region with the same starting point as the gradient descent run. A very similar track, but far fewer iterations.
