How to adjust Learning Rate? - DonghoonPark12/ssd.pytorch GitHub Wiki
- `StepLR` decays the learning rate by a factor of `gamma` every `step_size` epochs.
```python
from torch.optim.lr_scheduler import StepLR

# Assuming optimizer uses lr = 0.05 for all groups
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
# lr = 0.05   if epoch < 30
# lr = 0.005  if 30 <= epoch < 60
# lr = 0.0005 if 60 <= epoch < 90
for epoch in range(90):
    train(...)
    scheduler.step()  # advance the schedule once per epoch
```
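The decay rule above can be checked with a small pure-Python sketch. The closed form `initial_lr * gamma ** (epoch // step_size)` is an illustration of what `StepLR` computes, not the PyTorch API itself:

```python
def step_lr(initial_lr, gamma, step_size, epoch):
    """Closed-form StepLR schedule: decay lr by gamma every step_size epochs."""
    return initial_lr * gamma ** (epoch // step_size)

# Matches the comments above (initial lr = 0.05, step_size=30, gamma=0.1)
print(step_lr(0.05, 0.1, 30, 0))    # -> 0.05
print(step_lr(0.05, 0.1, 30, 45))   # ~0.005 (up to float rounding)
print(step_lr(0.05, 0.1, 30, 60))   # ~0.0005
```

The integer division `epoch // step_size` counts how many decay boundaries have passed, which is why the lr stays flat within each 30-epoch window.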
- With `MultiStepLR` you can instead specify the exact epochs (milestones) at which the decay happens.
```python
from torch.optim.lr_scheduler import MultiStepLR

# Assuming optimizer uses lr = 0.05 for all groups
scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
# lr = 0.05   if epoch < 30
# lr = 0.005  if 30 <= epoch < 80
# lr = 0.0005 if epoch >= 80
```
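The milestone schedule above also has a simple closed form: count how many milestones the current epoch has passed and apply `gamma` that many times. This is a pure-Python illustration, not the PyTorch implementation:

```python
from bisect import bisect_right

def multistep_lr(initial_lr, gamma, milestones, epoch):
    """Closed-form MultiStepLR: decay by gamma at each milestone epoch."""
    # bisect_right counts milestones <= epoch, i.e. decays already applied
    return initial_lr * gamma ** bisect_right(milestones, epoch)

print(multistep_lr(0.05, 0.1, [30, 80], 10))   # -> 0.05
print(multistep_lr(0.05, 0.1, [30, 80], 50))   # ~0.005
print(multistep_lr(0.05, 0.1, [30, 80], 90))   # ~0.0005
```

Note that `milestones` must be increasing for the `bisect_right` count to be correct.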
- `ReduceLROnPlateau` works the same way as its Keras counterpart: when the monitored metric stops improving, the learning rate is reduced. You can choose the `mode` (`'min'` or `'max'`) and tune the `threshold` (see the docs for details).
As the code below shows, the scheduler must be called after the validation step.
```python
from torch.optim.lr_scheduler import ReduceLROnPlateau

scheduler = ReduceLROnPlateau(optimizer, 'min')
for epoch in range(10):
    train(...)
    val_loss = validate(...)
    # Note that step should be called after validate()
    scheduler.step(val_loss)
```
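The plateau logic can be sketched in plain Python. `PlateauSketch` below is a toy illustration of the `'min'`-mode behavior only, ignoring `threshold`, `cooldown`, `min_lr`, and the per-parameter-group lrs that the real scheduler handles:

```python
class PlateauSketch:
    """Toy sketch of ReduceLROnPlateau's 'min' mode: if the monitored
    loss does not improve for more than `patience` epochs in a row,
    multiply the lr by `factor`."""
    def __init__(self, lr, factor=0.1, patience=10):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best = float("inf")
        self.num_bad = 0

    def step(self, val_loss):
        if val_loss < self.best:      # improvement: remember it, reset counter
            self.best = val_loss
            self.num_bad = 0
        else:                         # no improvement this epoch
            self.num_bad += 1
            if self.num_bad > self.patience:
                self.lr *= self.factor
                self.num_bad = 0

sched = PlateauSketch(0.05, factor=0.1, patience=2)
for loss in [1.0, 0.9, 0.9, 0.9, 0.9]:  # loss plateaus after epoch 1
    sched.step(loss)
print(sched.lr)  # ~0.005 after three non-improving epochs
```

This also makes it clear why `step(val_loss)` must come after `validate()`: the scheduler's decision depends on the freshly computed validation metric.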
[Refer]
- https://pytorch.org/docs/stable/optim.html
- https://pytorch.org/docs/stable/optim.html#torch.optim.lr_scheduler.LambdaLR