Examples
- 1. Unconstrained convex minimization
- 1.1. Gradient descent
- 1.2. Subgradient method
- 1.3. Subgradient method under restricted secant inequality and error bound
- 1.4. Gradient descent with exact line search
- 1.5. Conjugate gradient
- 1.6. Heavy Ball momentum
- 1.7. Accelerated gradient for convex objective
- 1.8. Accelerated gradient for strongly convex objective
- 1.9. Optimized gradient
- 1.10. Optimized gradient for gradient norm
- 1.11. Robust momentum
- 1.12. Triple momentum
- 1.13. Information-theoretic exact method
- 1.14. Proximal point
- 1.15. Accelerated proximal point
- 1.16. Inexact gradient descent
- 1.17. Inexact gradient descent with exact line search
- 1.18. Inexact accelerated gradient
- 1.19. Epsilon-subgradient method
- 1.20. Gradient descent for quadratically upper bounded convex objective
- 1.21. Gradient descent with decreasing step sizes for quadratically upper bounded convex objective
- 1.22. Conjugate gradient for quadratically upper bounded convex objective
- 1.23. Heavy Ball momentum for quadratically upper bounded convex objective
- 2. Composite convex minimization
- 2.1. Proximal gradient
- 2.2. Accelerated proximal gradient
- 2.3. Bregman proximal point
- 2.4. Douglas-Rachford splitting
- 2.5. Douglas-Rachford splitting contraction
- 2.6. Accelerated Douglas-Rachford splitting
- 2.7. Frank-Wolfe
- 2.8. Improved interior method
- 2.9. No Lips in function value
- 2.10. No Lips in Bregman divergence
- 2.11. Three operator splitting
- 3. Non-convex optimization
- 4. Stochastic and randomized convex minimization
- 5. Monotone inclusions and variational inequalities
- 6. Fixed point
- 7. Potential functions
- 8. Inexact proximal methods
- 9. Adaptive methods
- 10. Low-dimensional worst-case scenarios
- 11. Continuous-time models
- 12. Tutorials
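As a flavor of what the first example (1.1, gradient descent) studies, the sketch below is an illustrative stand-alone check, not taken from the library: it runs gradient descent with step size 1/L on an L-smooth convex quadratic and verifies the classical worst-case guarantee f(x_n) - f* ≤ L‖x0 - x*‖² / (2n). The matrix, starting point, and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

# L-smooth convex quadratic f(x) = 0.5 x^T A x, minimized at x* = 0 with f* = 0.
# The smoothness constant L is the largest eigenvalue of A (here 1.0).
A = np.diag([1.0, 0.1])
L = 1.0
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x0 = np.array([1.0, 1.0])  # arbitrary starting point for the demo
x = x0.copy()
n = 20
for _ in range(n):
    x = x - (1.0 / L) * grad(x)  # gradient descent with step size 1/L

# Classical convergence bound for smooth convex minimization:
#   f(x_n) - f* <= L ||x0 - x*||^2 / (2 n).
bound = L * np.dot(x0, x0) / (2 * n)
print(f(x), bound)
assert f(x) <= bound
```

Performance-estimation tooling tightens such hand-derived bounds: for this setting the exact worst case is known to be L‖x0 - x*‖² / (4n + 2), smaller than the textbook bound checked above.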