Here are some of my attempts to interpret the field of optimization.

  1. Introduction
    • Convex Sets
    • The Basics of Optimization
    • Unconstrained Optimization
    • Convex Optimization
  2. Categorization of Algorithms
    • Line Search
    • Trust Regions
  3. Theory of Constrained Optimization

  4. Line Search
    • Wolfe Conditions and Goldstein Conditions
    • Quasi-Newton Methods
    • Newton’s Method
    • Coordinate Descent Methods
    • Step-Length Selection Algorithms
  5. Rate of Convergence

  6. Trust Region Methods
    • The Cauchy Point and Related Algorithms
    • Improving on the Cauchy Point
    • The Dogleg Method
  7. Conjugate Gradient Methods
    • Conjugate Gradient Direction
    • Properties
    • A Practical Form of the Method
  8. Practical Newton Methods
    • Line Search Newton Methods
    • Line Search Newton–CG Method
    • Trust-Region Newton Methods
  9. Calculating Derivatives

  10. Quasi-Newton Methods
    • The BFGS Method
    • Properties of the BFGS Method
    • Global Convergence of the BFGS Method
  11. Nonlinear Least-Squares Problems


UNDER CONSTRUCTION