Implemented ADMM for solving convex optimization problems such as Lasso and Ridge regression.
The Alternating Direction Method of Multipliers (ADMM) is a framework for minimizing an objective function with a divide-and-conquer approach.
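For reference, the standard scaled-form ADMM iterations (following Boyd's notes linked below) for a problem split as minimize f(x) + g(z) subject to x - z = 0 are:

$$
\begin{aligned}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\rho}{2}\lVert x - z^k + u^k \rVert_2^2 \\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\rho}{2}\lVert x^{k+1} - z + u^k \rVert_2^2 \\
u^{k+1} &= u^k + x^{k+1} - z^{k+1}
\end{aligned}
$$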
ADMM works in two steps:
- Divide
  a. Break the original problem into smaller subproblems
  b. Distribute these subproblems across processors / computing resources
  c. Each processor solves its subproblem
- Conquer
  a. Combine the solutions from the N processors into one (see the sketch after this list)
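To make the divide/conquer split concrete, here is a minimal consensus-ADMM sketch for distributed least squares. This is an illustration, not code from this repo; the names `consensus_admm`, `A_parts`, and `b_parts` are hypothetical.

```python
import torch

def consensus_admm(A_parts, b_parts, rho=1.0, num_iters=50):
    """Distributed least squares: each 'processor' owns one (A_i, b_i) shard."""
    n = A_parts[0].shape[1]
    N = len(A_parts)
    xs = [torch.zeros(n) for _ in range(N)]
    us = [torch.zeros(n) for _ in range(N)]  # scaled dual variables
    z = torch.zeros(n)                       # shared consensus variable
    for _ in range(num_iters):
        # Divide: each worker solves its local regularized subproblem
        # (these loops could run in parallel on separate processors).
        for i in range(N):
            H = A_parts[i].T @ A_parts[i] + rho * torch.eye(n)
            rhs = A_parts[i].T @ b_parts[i] + rho * (z - us[i])
            xs[i] = torch.linalg.solve(H, rhs)
        # Conquer: average the local solutions into the global variable.
        z = torch.stack([xs[i] + us[i] for i in range(N)]).mean(dim=0)
        # Dual updates keep the local copies consistent with z.
        for i in range(N):
            us[i] = us[i] + xs[i] - z
    return z
```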
We implemented ADMM in the PyTorch framework for Lasso and Ridge regression (click here to view the implementation).
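For concreteness, here is a minimal sketch of an ADMM Lasso solver in PyTorch. It is not the repo's actual code; `admm_lasso` and `soft_threshold` are illustrative names.

```python
import torch

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return torch.sign(v) * torch.clamp(v.abs() - kappa, min=0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, num_iters=100):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via scaled-form ADMM."""
    n = A.shape[1]
    x = torch.zeros(n)
    z = torch.zeros(n)
    u = torch.zeros(n)  # scaled dual variable
    # Factor (A^T A + rho*I) once; it is reused in every x-update.
    L = torch.linalg.cholesky(A.T @ A + rho * torch.eye(n))
    Atb = A.T @ b
    for _ in range(num_iters):
        # x-update: solve the ridge-like quadratic subproblem.
        x = torch.cholesky_solve((Atb + rho * (z - u)).unsqueeze(1), L).squeeze(1)
        # z-update: soft-thresholding enforces sparsity.
        z = soft_threshold(x + u, lam / rho)
        # Dual update: accumulate the residual x - z.
        u = u + x - z
    return z
```

For Ridge regression the only change is the z-update: since the proximal operator of the squared L2 penalty is linear, soft-thresholding is replaced by the scaling `z = rho * (x + u) / (lam + rho)`.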
Figures: ADMM Lasso loss; ADMM vs. scikit-learn Lasso solver; ADMM Ridge regression loss; ADMM vs. scikit-learn Ridge regression solver.
The contour plot shows that ADMM approaches the optimal solution quickly at first and then takes smaller steps as it nears the optimum. This supports the view of ADMM as a middle ground: it can solve problems nearly as fast as Newton's method while not being restricted to quadratic problems.
- ADMM needs distributed infrastructure to scale to general problems
- The gradient of each subproblem must be known in order to divide the problem
- How do we divide the problem into smaller subproblems?
  a. This is why the subproblems must be devised manually before scaling via parallelization
Click here to view the presentation.
- My Convex Optimization assignments: https://github.com/bhushan23/Convex-Optimization
- Reading material by Professor Stephen Boyd: http://web.stanford.edu/~boyd/admm.html
- ADMM implementation by Niru Maheswaranathan: https://github.com/nirum/ADMM
- CVXPY, implementations of general convex optimization problems by the Stanford Convex Optimization group: https://github.com/cvxgrp/cvxpy