[From AlexC in GC]
Currently it is only possible to set betas for the AdamW optimizer. We would like to support this for LAMB as well.
- Rename the training arguments `adam_beta1` and `adam_beta2` to `optimizer_beta1` and `optimizer_beta2`
- In `IPUTrainer.create_optimizer`, pass the values of the above parameters to `betas` as part of `optimizer_kwargs`
Bonus point: while you are at it, you could also rename the parameter `adam_epsilon` to `optimizer_epsilon`, since this value is used by LAMB too.
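The renaming could look roughly like the sketch below. The `TrainingArguments` dataclass and `build_optimizer_kwargs` helper here are simplified stand-ins, not the real `IPUTrainer` code; only the renamed field names (`optimizer_beta1`, `optimizer_beta2`, `optimizer_epsilon`) come from the proposal above.

```python
from dataclasses import dataclass


@dataclass
class TrainingArguments:
    # Renamed from adam_beta1 / adam_beta2 / adam_epsilon, since these
    # values now apply to both AdamW and LAMB.
    optimizer_beta1: float = 0.9
    optimizer_beta2: float = 0.999
    optimizer_epsilon: float = 1e-8


def build_optimizer_kwargs(args: TrainingArguments) -> dict:
    # Hypothetical helper: forwards the shared hyperparameters to whichever
    # optimizer is selected -- both AdamW and LAMB accept `betas` and `eps`
    # in torch-style optimizer APIs.
    return {
        "betas": (args.optimizer_beta1, args.optimizer_beta2),
        "eps": args.optimizer_epsilon,
    }


print(build_optimizer_kwargs(TrainingArguments(optimizer_beta1=0.8)))
```

Inside `create_optimizer`, these kwargs would then be merged into `optimizer_kwargs` regardless of which optimizer is chosen.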