
CosineAnnealingLR#

class torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max, eta_min=0.0, last_epoch=-1)[source]#

Set the learning rate of each parameter group using a cosine annealing schedule.

The learning rate is updated recursively using

\eta_{t+1} = \eta_{\min} + (\eta_t - \eta_{\min}) \cdot \frac{1 + \cos\left(\frac{(T_{cur}+1)\pi}{T_{max}}\right)}{1 + \cos\left(\frac{T_{cur}\pi}{T_{max}}\right)}

This implements a recursive approximation of the closed-form schedule proposed in SGDR: Stochastic Gradient Descent with Warm Restarts

\eta_t = \eta_{\min} + \frac{1}{2}(\eta_{\max} - \eta_{\min})\left(1 + \cos\left(\frac{T_{cur}\pi}{T_{max}}\right)\right)

where

  • \eta_t is the learning rate at step t

  • T_{cur} is the number of epochs since the last restart

  • T_{max} is the maximum number of epochs in a cycle

Note

Although SGDR includes periodic restarts, this implementation performs cosine annealing without restarts, so T_{cur} = t increases monotonically with each call to step().
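As a quick sanity check, the closed-form expression above can be evaluated directly in plain Python. The sketch below is illustrative only and not part of the torch.optim API; base_lr stands in for \eta_{\max}:

>>> import math
>>> def cosine_annealed_lr(base_lr, eta_min, t_cur, t_max):
>>>     # half cosine from base_lr at t_cur = 0 down to eta_min at t_cur = t_max
>>>     return eta_min + 0.5 * (base_lr - eta_min) * (1 + math.cos(t_cur * math.pi / t_max))
>>> [round(cosine_annealed_lr(0.1, 0.0, t, 100), 4) for t in (0, 50, 100)]
[0.1, 0.05, 0.0]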

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • T_max (int) – Maximum number of iterations.

  • eta_min (float) – Minimum learning rate. Default: 0.

  • last_epoch (int) – The index of the last epoch. Default: -1.

Example

>>> num_epochs = 100
>>> scheduler = CosineAnnealingLR(optimizer, T_max=num_epochs)
>>> for epoch in range(num_epochs):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
[Figure: CosineAnnealingLR learning rate schedule over one cycle]
get_last_lr()[source]#

Return the last learning rate computed by the current scheduler.

Return type

list[float]
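For example, get_last_lr() can be used to log the rate the scheduler has just applied; a minimal sketch assuming a toy linear model and SGD optimizer (train(...) is a placeholder, as in the example above):

>>> model = torch.nn.Linear(10, 2)
>>> optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
>>> scheduler = CosineAnnealingLR(optimizer, T_max=10, eta_min=0.001)
>>> for epoch in range(10):
>>>     train(...)  # forward/backward pass and optimizer.step()
>>>     scheduler.step()
>>>     # one value per parameter group
>>>     print(f"epoch {epoch}: lr = {scheduler.get_last_lr()[0]:.5f}")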

get_lr()[source]#

Retrieve the learning rate of each parameter group.

Return type

list[float]

load_state_dict(state_dict)[source]#

Load the scheduler's state.

Parameters

state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().

state_dict()[source]#

Return the state of the scheduler as a dict.

It contains an entry for every variable in self.__dict__ which is not the optimizer.

Return type

dict[str, Any]
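For example, saving and restoring the scheduler state alongside the optimizer lets annealing resume mid-cycle from a checkpoint instead of restarting; a minimal sketch in which the file name and dict keys are illustrative:

>>> torch.save({
>>>     "optimizer": optimizer.state_dict(),
>>>     "scheduler": scheduler.state_dict(),
>>> }, "checkpoint.pt")
>>> # On resume: recreate the optimizer and scheduler, then restore their states
>>> checkpoint = torch.load("checkpoint.pt")
>>> optimizer.load_state_dict(checkpoint["optimizer"])
>>> scheduler = CosineAnnealingLR(optimizer, T_max=100)
>>> scheduler.load_state_dict(checkpoint["scheduler"])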

step(epoch=None)[source]#

Perform a step.
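step() advances T_{cur} by one and updates each parameter group's lr. It is typically called once per epoch, after the optimizer update; a minimal sketch of the ordering, where dataloader and compute_loss are hypothetical placeholders:

>>> for epoch in range(num_epochs):
>>>     for batch in dataloader:
>>>         optimizer.zero_grad()
>>>         loss = compute_loss(batch)  # placeholder for the forward pass
>>>         loss.backward()
>>>         optimizer.step()  # update parameters first
>>>     scheduler.step()      # then advance the schedule (T_cur += 1)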