Numerical Optimisation
-
Gradient-Based Algorithms
A brief overview of the theory underpinning gradient-based optimisation algorithms (steepest descent and conjugate gradients) and their implementation.
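As a taste of what this post covers, a minimal sketch of steepest descent with a fixed step size might look like the following (the function name and parameters are illustrative, not taken from the post itself):

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
    """Minimise a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop once the gradient is (nearly) zero
            break
        x = x - lr * g               # move downhill by a fixed step
    return x

# Minimise f(x, y) = x^2 + y^2, whose gradient is (2x, 2y); minimum at the origin
x_min = steepest_descent(lambda x: 2 * x, np.array([1.0, -2.0]))
```

The fixed learning rate is the simplest choice; the post goes further with conjugate gradients, which choose better search directions.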
-
Quasi-Newton Methods
An overview of the BFGS and L-BFGS minimisation algorithms, together with inexact line searches, and their implementation.
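To give a flavour of the topic, here is one possible minimal BFGS sketch with a backtracking (Armijo) line search; it is a simplified illustration under standard textbook conventions, not the post's actual implementation:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimal BFGS: maintain an approximation H to the inverse Hessian."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # backtracking line search (Armijo rule)
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p                      # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                  # change in gradient
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        # Standard BFGS update of the inverse Hessian approximation
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Minimise the ill-conditioned quadratic f(x, y) = x^2 + 10 y^2
x_opt = bfgs(lambda x: x[0]**2 + 10 * x[1]**2,
             lambda x: np.array([2 * x[0], 20 * x[1]]),
             np.array([1.0, 1.0]))
```

L-BFGS replaces the dense matrix `H` with a short history of `(s, y)` pairs, which is what makes it practical in high dimensions.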
-
Forward-Mode Automatic Differentiation
An introduction to forward-mode automatic differentiation and its implementation using operator overloading in Python.
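The core idea can be sketched with dual numbers: overload arithmetic so every value carries its derivative along for the ride. This toy `Dual` class is illustrative, assuming only addition and multiplication:

```python
class Dual:
    """Dual number a + b*eps with eps^2 = 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

# d/dx of f(x) = x*x + 3x at x = 2 is 2x + 3 = 7
x = Dual(2.0, 1.0)        # seed the derivative of the input with 1
y = x * x + 3 * x         # y.val == 10.0, y.dot == 7.0
```

Evaluating `f` once yields both the value and the exact derivative, with no symbolic manipulation or finite differences.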
-
Reverse-Mode Automatic Differentiation
We implement reverse-mode automatic differentiation in Python and introduce some graph-theory tools.
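In reverse mode, each operation records its inputs and local derivatives in a computation graph, and a backward pass accumulates gradients via the chain rule. A toy sketch (recursive rather than using a proper topological ordering, which is where the graph-theory tools come in) might look like:

```python
class Var:
    """Node in a computation graph; backward() accumulates gradients."""
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(ab)/da = b, d(ab)/db = a
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)  # chain rule down the graph

# z = x*y + x at (x, y) = (2, 3): dz/dx = y + 1 = 4, dz/dy = x = 2
x, y = Var(2.0), Var(3.0)
z = x * y + x
z.backward()
```

The naive recursion revisits shared nodes, which is why real implementations traverse the graph once in reverse topological order.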
-
Putting it all Together: Linear Regression
We combine all the components built above to perform linear regression (admittedly overkill for this problem).
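As a rough sketch of the end goal, least-squares linear regression can be fit by gradient descent on the mean squared error; this standalone NumPy version (with hand-derived gradients and made-up data) stands in for the full pipeline the post assembles:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.5               # synthetic targets: bias 0.5, no noise

# Loss L(w, b) = mean((Xw + b - y)^2); its gradients are
# dL/dw = (2/n) X^T r and dL/db = (2/n) sum(r), where r = Xw + b - y
w, b = np.zeros(2), 0.0
for _ in range(2000):
    r = X @ w + b - y              # residuals
    w -= 0.1 * (2 / len(y)) * (X.T @ r)
    b -= 0.1 * (2 / len(y)) * r.sum()
```

In the post itself the gradients would come from automatic differentiation and the updates from the minimisers built earlier, rather than being written out by hand.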