Optimization for AI
Here are my notes attempting to make sense of the field of optimization.
- Introduction
- Convex Sets
- The Basics of Optimization
- Unconstrained Optimization
- Convex Optimization
- Categorization of Algorithms
- Line Search
- Trust Regions
- [ ] Theory of Constrained Optimization
- Line Search
- Wolfe Conditions and Goldstein Conditions
- Quasi-Newton Methods
- Newton’s Method
- Coordinate Descent Methods
- Step-Length Selection Algorithms
- [ ] Rate of Convergence
- Trust Region Methods
- The Cauchy Point and Related Algorithms
- Improving on the Cauchy Point
- The Dogleg Method
- Conjugate Gradient Methods
- Conjugate Gradient Direction
- Properties
- A Practical Form of the Method
- Practical Newton Methods
- Line Search Newton Methods
- Line Search Newton–CG Method
- Trust-Region Newton Methods
- [ ] Calculating Derivatives
- Quasi-Newton Methods
- The BFGS Method
- Properties of the BFGS Method
- Global Convergence of the BFGS Method
- [ ] Nonlinear Least-Squares Problems
UNDER CONSTRUCTION
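To give a flavor of the material, here is a minimal sketch of gradient descent with a backtracking line search enforcing the Armijo (sufficient-decrease) condition, one of the step-length selection schemes listed above. The quadratic objective and all parameter values (`alpha`, `rho`, `c`) are illustrative choices, not fixed prescriptions.

```python
import numpy as np

def f(x):
    # Simple convex quadratic with its minimum at the origin.
    return 0.5 * x @ x

def grad_f(x):
    return x

def backtracking_line_search(x, p, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo (sufficient-decrease) condition holds:
    f(x + alpha*p) <= f(x) + c * alpha * grad_f(x)^T p."""
    while f(x + alpha * p) > f(x) + c * alpha * grad_f(x) @ p:
        alpha *= rho
    return alpha

x = np.array([3.0, -4.0])
for _ in range(50):
    p = -grad_f(x)                        # steepest-descent direction
    alpha = backtracking_line_search(x, p)
    x = x + alpha * p

print(np.linalg.norm(x))  # essentially 0: the iterates converge to the minimizer
```

For this particular objective the full step `alpha = 1` already satisfies the Armijo test, so convergence is immediate; on harder problems the backtracking loop is what keeps the step from overshooting.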