SAGA: Introduction to Variance Reduction
Date:
This seminar covers a variance reduction framework for improving the expected convergence rate in convex finite-sum stochastic optimization. Instead of recomputing a full gradient at a fixed reference point as SVRG does, the stochastic average gradient algorithm (SAGA) keeps the most recent gradient of each component, evaluated at the last iterate where that component was sampled. Compared with stochastic variance reduced gradient (SVRG), this saves computation while retaining linear convergence, and it supports composite objectives by applying a proximal operator to the regularizer.
Seminar slides: [slides].
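To make the update concrete, below is a minimal NumPy sketch of proximal SAGA. It is illustrative only: the component-gradient oracle `grad_i`, the proximal map `prox`, and the lasso demo data at the bottom are hypothetical stand-ins, not taken from the paper or the slides.

```python
import numpy as np

def saga(grad_i, prox, x0, n, step, n_iters, seed=0):
    """Proximal SAGA for f(x) = (1/n) * sum_i f_i(x) + h(x).

    grad_i(i, x) -- gradient of the i-th component f_i at x (assumed oracle)
    prox(v, s)   -- proximal operator of the regularizer h with step size s
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    # Table of the most recent gradient seen for each component,
    # all initialized at the starting point.
    table = np.array([grad_i(i, x) for i in range(n)])
    table_mean = table.mean(axis=0)
    for _ in range(n_iters):
        j = rng.integers(n)
        g_new = grad_i(j, x)
        # SAGA gradient estimate: unbiased over the choice of j, with
        # variance shrinking as the table entries settle near the optimum.
        v = g_new - table[j] + table_mean
        # Refresh the running mean and the table entry for component j.
        table_mean = table_mean + (g_new - table[j]) / n
        table[j] = g_new
        # Proximal step handles the (possibly non-smooth) regularizer h.
        x = prox(x - step * v, step)
    return x

# Illustrative lasso instance (hypothetical data):
# f_i(x) = 0.5 * (a_i @ x - b_i)^2,  h(x) = lam * ||x||_1.
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 5))
b = A @ np.ones(5)
lam = 0.1
grad_i = lambda i, x: (A[i] @ x - b[i]) * A[i]
soft_threshold = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)
L = np.max(np.sum(A**2, axis=1))  # per-component Lipschitz constant
x_hat = saga(grad_i, soft_threshold, np.zeros(5), n=100, step=1 / (3 * L), n_iters=5000)
```

The estimate `v` is unbiased because `table[j]` and `table_mean` do not depend on the sampled index `j`, and after initialization each iteration costs one component gradient plus O(d) bookkeeping, with no full-gradient passes as in SVRG.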
Reference Reading Materials:
- Defazio, Aaron, Francis Bach, and Simon Lacoste-Julien. “SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives.” Advances in Neural Information Processing Systems 27 (2014). [pdf]