
LSTM

class torch.ao.nn.quantizable.LSTM(input_size, hidden_size, num_layers=1, bias=True, batch_first=False, dropout=0.0, bidirectional=False, device=None, dtype=None, *, split_gates=False)[source]

A quantizable long short-term memory (LSTM).

For the description and the argument types, please refer to LSTM.

Variables

layers – instances of _LSTMLayer

Note

To access the weights and biases, you need to access them per layer. See the examples below.

Examples

>>> import torch
>>> import torch.ao.nn.quantizable as nnqa
>>> rnn = nnqa.LSTM(10, 20, 2)
>>> input = torch.randn(5, 3, 10)
>>> h0 = torch.randn(2, 3, 20)
>>> c0 = torch.randn(2, 3, 20)
>>> output, (hn, cn) = rnn(input, (h0, c0))
>>> # To get the weights:
>>> print(rnn.layers[0].weight_ih)
tensor([[...]])
>>> print(rnn.layers[0].weight_hh)
AssertionError: There is no reverse path in the non-bidirectional layer
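This class is typically used as the observed ("quantizable") stand-in for a float torch.nn.LSTM during eager-mode static post-training quantization. The sketch below is not part of this page; it is a minimal illustration of that flow under some assumptions: the LSTM is wrapped in an nn.Sequential container so the passes can swap the submodule, the eager-mode custom-module configuration keys ("float_to_observed_custom_module_class" / "observed_to_quantized_custom_module_class") are used, and torch.ao.nn.quantized.LSTM is taken as the quantized counterpart. Please check the quantization documentation for the authoritative recipe.

>>> import torch
>>> from torch import nn
>>> import torch.ao.nn.quantizable as nnqa
>>> import torch.ao.nn.quantized as nnq
>>> from torch.ao.quantization import get_default_qconfig, prepare, convert
>>> # Wrap the float LSTM in a container so the eager-mode passes can swap
>>> # the submodule for the observed / quantized custom classes (assumption:
>>> # the swap only applies to child modules, not the root module itself).
>>> float_model = nn.Sequential(nn.LSTM(10, 20, 2)).eval()
>>> float_model.qconfig = get_default_qconfig("fbgemm")
>>> prepared = prepare(
...     float_model,
...     prepare_custom_config_dict={
...         "float_to_observed_custom_module_class": {nn.LSTM: nnqa.LSTM}})
>>> _ = prepared(torch.randn(5, 3, 10))  # calibration pass on representative data
>>> quantized = convert(
...     prepared,
...     convert_custom_config_dict={
...         "observed_to_quantized_custom_module_class": {nnqa.LSTM: nnq.LSTM}})

After convert, the internal gate layers generally operate on quantized tensors, so running the converted model usually also involves quantizing its inputs (for example with QuantStub/DeQuantStub), as described in the static quantization recipe.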