How to adjust Learning Rate? - DonghoonPark12/ssd.pytorch GitHub Wiki

1. `StepLR` decays the learning rate by a factor of `gamma` every `step_size` epochs.
```python
from torch.optim.lr_scheduler import StepLR

# Assuming optimizer uses lr = 0.05 for all groups
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 60
# lr = 0.0005   if 60 <= epoch < 90
```
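The decay rule above can be written as a plain function. This is only a sketch of the schedule's arithmetic, not PyTorch code; `step_lr` is a hypothetical helper:

```python
def step_lr(initial_lr, gamma, step_size, epoch):
    """Learning rate StepLR would use at a given epoch:
    decayed by `gamma` once every `step_size` epochs."""
    return initial_lr * gamma ** (epoch // step_size)

# With initial_lr=0.05, step_size=30, gamma=0.1 this reproduces
# the comment table above: 0.05, then 0.005, then 0.0005.
```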
  1. ๊ตฌ๊ฐ„์„ ์ •ํ•ด ์ค„ ์ˆ˜๋„ ์žˆ๋‹ค.
```python
from torch.optim.lr_scheduler import MultiStepLR

# Assuming optimizer uses lr = 0.05 for all groups
scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
# lr = 0.05     if epoch < 30
# lr = 0.005    if 30 <= epoch < 80
# lr = 0.0005   if epoch >= 80
```
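The milestone rule boils down to counting how many milestones the current epoch has passed. A sketch of that arithmetic (`multi_step_lr` is a hypothetical helper, not part of PyTorch):

```python
import bisect

def multi_step_lr(initial_lr, gamma, milestones, epoch):
    """Learning rate MultiStepLR would use at a given epoch:
    decayed by `gamma` once per milestone already reached."""
    # bisect_right counts milestones m with m <= epoch
    return initial_lr * gamma ** bisect.bisect_right(milestones, epoch)

# With initial_lr=0.05, milestones=[30, 80], gamma=0.1 this
# reproduces the comment table above.
```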
  1. Keras์—์„œ ์“ฐ๋˜ ๋ฐฉ๋ฒ•์„ ๊ทธ๋Œ€๋กœ ์‚ฌ์šฉํ•  ์ˆ˜๋„ ์žˆ๋‹ค. ์ •ํ™•๋„๊ฐ€ ๋” ํ–ฅ์ƒ๋˜์ง€ ์•Š์œผ๋ฉด lr์„ ๊ฐ์†Œ์‹œํ‚จ๋‹ค. mode๋ฅผ ์ •ํ•ด ์ค„ ์ˆ˜๋„ ์žˆ์œผ๋ฉฐ threshold ๊ฐ’๋„ ์ˆ˜์ •์ด ๊ฐ€๋Šฅํ•˜๋‹ค(detail).
    ์…ˆํ”Œ ์ฝ”๋“œ๋ฅผ ๋ณด๋ฉด, ๋ฐ˜๋“œ์‹œ validation ์ฒดํฌ๋ฅผ ํ•œ ๋‹ค์Œ์— scheduler๋ฅผ ํ˜ธ์ถœํ•ด์•ผ ํ•œ๋‹ค๊ณ  ํ•œ๋‹ค.
```python
from torch.optim.lr_scheduler import ReduceLROnPlateau

scheduler = ReduceLROnPlateau(optimizer, 'min')
for epoch in range(10):
    train(...)
    val_loss = validate(...)
    # Note that step should be called after validate()
    scheduler.step(val_loss)
```
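The plateau logic can be sketched as follows. This is a simplified model of `ReduceLROnPlateau` with `mode='min'` only, ignoring `threshold`, cooldown, and `min_lr`; `PlateauSketch` is a hypothetical class, not the PyTorch implementation:

```python
class PlateauSketch:
    """Simplified plateau scheduler: multiply lr by `factor`
    after `patience` consecutive epochs without improvement."""

    def __init__(self, lr, factor=0.1, patience=10):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.best = float('inf')
        self.num_bad_epochs = 0

    def step(self, val_loss):
        if val_loss < self.best:       # metric improved: reset counter
            self.best = val_loss
            self.num_bad_epochs = 0
        else:                          # no improvement this epoch
            self.num_bad_epochs += 1
            if self.num_bad_epochs > self.patience:
                self.lr *= self.factor # plateau reached: decay lr
                self.num_bad_epochs = 0
```

This makes it clear why `step(val_loss)` must come after validation: the scheduler needs the epoch's validation metric to decide whether the model has plateaued.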

[Refer]
https://pytorch.org/docs/stable/optim.html
https://pytorch.org/docs/stable/optim.html#torch.optim.lr_scheduler.LambdaLR