Stochastic gradient descent optimisation in Matlab
Using the Adam optimiser
21st February, 2017

Stochastic gradient descent (SGD) is a powerful tool for optimisation, which relies on estimating gradients over small, randomly selected batches of data. This approach is efficient (gradients only need to be evaluated over a few data points at a time) and exploits the noise inherent in the stochastic gradient estimates to help escape local minima. This is a Matlab implementation of Adam, a recent and powerful SGD algorithm.
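To make the idea concrete, here is a minimal Matlab sketch of the Adam update rule (Kingma & Ba, 2015) applied to a toy least-squares problem. The toy objective, hyperparameter values, and variable names are illustrative assumptions for this sketch, not the API of the implementation the post describes.

% Toy data: linear model y = X*w_true + noise (illustrative, not the post's code).
rng(0);
n = 1000; d = 5;
X = randn(n, d);
w_true = randn(d, 1);
y = X * w_true + 0.1 * randn(n, 1);

% Adam hyperparameters (defaults suggested in the original paper).
alpha = 0.001; beta1 = 0.9; beta2 = 0.999; epsilon = 1e-8;
batchSize = 32; nSteps = 2000;

w = zeros(d, 1);          % parameters to optimise
m = zeros(d, 1);          % first-moment (mean) estimate
v = zeros(d, 1);          % second-moment (uncentred variance) estimate

for t = 1:nSteps
    % Stochastic gradient: evaluate on a small random batch only.
    idx = randi(n, batchSize, 1);
    g = X(idx, :)' * (X(idx, :) * w - y(idx)) / batchSize;

    % Update the biased moment estimates.
    m = beta1 * m + (1 - beta1) * g;
    v = beta2 * v + (1 - beta2) * g.^2;

    % Bias-correct the estimates and take the step.
    mHat = m / (1 - beta1^t);
    vHat = v / (1 - beta2^t);
    w = w - alpha * mHat ./ (sqrt(vHat) + epsilon);
end

fprintf('Parameter error: %.4f\n', norm(w - w_true));

Note how the noisy batch gradient g is smoothed by the running moment estimates, so each step is scaled per-parameter by the inverse square root of the recent gradient magnitude; this is what makes Adam robust to poorly scaled problems.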