ICLR 2019 | 'Fast as Adam & Good as SGD' — New Optimizer Has Both
A paper recently accepted to ICLR 2019 challenges the familiar trade-off between the fast convergence of adaptive methods and the stronger final performance of SGD with a novel optimizer, AdaBound, which its authors say can train machine learning models "as fast as Adam and as good as SGD." In essence, AdaBound is an Adam variant that places dynamic bounds on its learning rates, so training starts out Adam-like and transitions gradually and smoothly to SGD.
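To make the idea concrete, below is a minimal NumPy sketch of a dynamic-bound update: an Adam-style per-parameter step size is clipped into a band whose lower and upper edges tighten toward a fixed SGD learning rate as training progresses. The function name `adabound_step`, the `final_lr` and `gamma` parameters, and the exact bound schedules are illustrative assumptions for this sketch, not the paper's precise formulation or the authors' released code.

```python
import numpy as np

def adabound_step(param, grad, state, t, lr=1e-3, final_lr=0.1,
                  betas=(0.9, 0.999), gamma=1e-3, eps=1e-8):
    """One dynamic-bound update (illustrative sketch, not the official code):
    an Adam-style step size is clipped into bounds that converge toward a
    constant SGD learning rate (final_lr)."""
    beta1, beta2 = betas
    m, v = state
    m = beta1 * m + (1 - beta1) * grad        # first moment, as in Adam
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment, as in Adam
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)

    # Dynamic bounds (assumed schedule): wide at the start, tightening toward
    # final_lr as t grows, so early steps behave like Adam and late steps
    # behave like SGD with a fixed learning rate.
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))

    step_size = np.clip(lr / (np.sqrt(v_hat) + eps), lower, upper)
    param = param - step_size * m_hat
    return param, (m, v)

# Toy usage: minimize f(x) = x^2 from x = 5.
x = np.array([5.0])
state = (np.zeros_like(x), np.zeros_like(x))
for t in range(1, 1001):
    grad = 2 * x
    x, state = adabound_step(x, grad, state, t)
print(x)  # approaches 0
```

The key design choice the sketch illustrates is that the clipping interval collapses onto `final_lr` over time, which is what the authors describe as a gradual, smooth hand-off from adaptive step sizes to plain SGD.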