In train.py there is code that scales the learning rate:
utils.opt.adjust_learning_rate(optimizer, epoch)
However, the comment says the learning rate is decayed by 1/10 every 30 epochs, while scale is set to 2 by default. Is this the setting used in the research paper? Is the decay step 2 epochs or 30 epochs?
def adjust_learning_rate(optimizer, epoch, scale=2):
    # Sets the learning rate to the initial LR decayed by 10 every 30 epochs
    for param_group in optimizer.param_groups:
        lr = param_group['lr']
        lr = lr * (0.1 ** (epoch // scale))
        param_group['lr'] = lr
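For comparison, a decay of 1/10 every 30 epochs is usually computed from the initial learning rate rather than the current one, since multiplying the current value on every call compounds the decay. A minimal sketch of that variant (the initial_lr argument and the step=30 default here are assumptions for illustration, not part of this repository's code):

def adjust_learning_rate(optimizer, epoch, initial_lr, step=30):
    # Hypothetical variant: decay the *initial* LR by a factor of 10
    # every `step` epochs, so repeated calls do not compound the decay.
    lr = initial_lr * (0.1 ** (epoch // step))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

Called once per epoch with, say, initial_lr=0.1 and step=30, this keeps the learning rate at 0.1 for epochs 0-29, 0.01 for epochs 30-59, and so on.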