Modern Applied Optimization
This course assumes no background in optimization. The focus will be on classical and modern algorithms, with a view towards applications in finance, machine learning, and statistics. In the first half of the course, we will cover classical algorithms: univariate optimization and root finding (Newton, secant, regula falsi, etc.), unconstrained optimization (steepest descent, Newton, quasi-Newton, Gauss-Newton, Barzilai-Borwein, etc.), and constrained optimization (penalty, barrier, augmented Lagrangian, active set, etc.). In the second half of the course, we will cover algorithms that have become popular over the last decade: proximal algorithms, stochastic gradient descent and its variants, momentum methods, mirror descent, etc. Applications to machine learning and statistics will include ridge/lasso/logistic regression, support vector machines with hinge/sigmoid loss, optimal experimental design, maximum entropy, maximum likelihood, Gaussian covariance estimation, and feedforward neural networks. Applications in finance will include Markowitz's classical portfolio optimization, portfolio optimization with diversification or loss-risk constraints, bounding portfolio risk with incomplete covariance information, and the log-optimal investment strategy.
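As a small taste of the first topic on the list, here is a minimal sketch of Newton's method for univariate root finding; the function, starting point, and tolerance below are illustrative choices, not course material:

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: iterate x_{k+1} = x_k - f(x_k) / f'(x_k)
    until the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x = x - step
        if abs(step) < tol:
            break
    return x

# Example: find the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
```

The secant method covered alongside it replaces the derivative f'(x_k) with a finite-difference estimate from the two most recent iterates, trading a slower convergence rate for not needing f'.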
This course counts towards the Financial Data Science concentration.