tensordict.nn.EnsembleModule
- class tensordict.nn.EnsembleModule(*args, **kwargs)
Module that wraps another module and repeats it to form an ensemble.
- Parameters:
module (nn.Module) – the nn.Module to duplicate and wrap.
num_copies (int) – the number of copies of the module to make.
parameter_init_function (Callable) – a function that takes a module copy and initializes its parameters (see the sketch after this list).
expand_input (bool) – whether to expand the input TensorDict to match the number of copies. This should be True unless you are chaining ensemble modules together, e.g. EnsembleModule(cnn) -> EnsembleModule(mlp). If set to False, EnsembleModule(mlp) will expect the previous module(s) to have already expanded the input.
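The parameter_init_function hook controls how each copy is initialized. Below is a minimal sketch of such a function; the Xavier-uniform/zero-bias scheme and the assumption that the hook mutates the copy in place are for illustration only, not library defaults.

>>> import torch
>>> from torch import nn
>>> from tensordict.nn import TensorDictModule, EnsembleModule
>>> def init_copy(copy):
...     # Hypothetical initializer (assumption: the hook mutates the
...     # copy in place): re-draw every nn.Linear so ensemble members
...     # start from independent weights.
...     for layer in copy.modules():
...         if isinstance(layer, nn.Linear):
...             nn.init.xavier_uniform_(layer.weight)
...             nn.init.zeros_(layer.bias)
...
>>> mod = TensorDictModule(nn.Linear(4, 2), in_keys=['a'], out_keys=['b'])
>>> ensemble = EnsembleModule(mod, num_copies=3, parameter_init_function=init_copy)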
Examples
>>> import torch
>>> from torch import nn
>>> from tensordict.nn import TensorDictModule, EnsembleModule
>>> from tensordict import TensorDict
>>> net = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 2))
>>> mod = TensorDictModule(net, in_keys=['a'], out_keys=['b'])
>>> ensemble = EnsembleModule(mod, num_copies=3)
>>> data = TensorDict({'a': torch.randn(10, 4)}, batch_size=[10])
>>> ensemble(data)
TensorDict(
    fields={
        a: Tensor(shape=torch.Size([3, 10, 4]), device=cpu, dtype=torch.float32, is_shared=False),
        b: Tensor(shape=torch.Size([3, 10, 2]), device=cpu, dtype=torch.float32, is_shared=False)},
    batch_size=torch.Size([3, 10]),
    device=None,
    is_shared=False)
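As a quick follow-up (a sketch assuming, as above, that each copy's parameters are initialized independently; this output is not from the original docs): the slices along the leading ensemble dimension generally disagree.

>>> out = ensemble(data)
>>> # dim 0 indexes the three ensemble members; independent
>>> # initialization makes their predictions differ.
>>> torch.allclose(out['b'][0], out['b'][1])
False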
To stack EnsembleModules on top of one another, we should be careful to set expand_input to False from the second module onward.
Examples
>>> import torch
>>> from tensordict.nn import TensorDictModule, TensorDictSequential, EnsembleModule
>>> from tensordict import TensorDict
>>> module = TensorDictModule(torch.nn.Linear(2, 3), in_keys=['bork'], out_keys=['dork'])
>>> next_module = TensorDictModule(torch.nn.Linear(3, 1), in_keys=['dork'], out_keys=['spork'])
>>> e0 = EnsembleModule(module, num_copies=4, expand_input=True)
>>> e1 = EnsembleModule(next_module, num_copies=4, expand_input=False)
>>> seq = TensorDictSequential(e0, e1)
>>> data = TensorDict({'bork': torch.randn(5, 2)}, batch_size=[5])
>>> seq(data)
TensorDict(
    fields={
        bork: Tensor(shape=torch.Size([4, 5, 2]), device=cpu, dtype=torch.float32, is_shared=False),
        dork: Tensor(shape=torch.Size([4, 5, 3]), device=cpu, dtype=torch.float32, is_shared=False),
        spork: Tensor(shape=torch.Size([4, 5, 1]), device=cpu, dtype=torch.float32, is_shared=False)},
    batch_size=torch.Size([4, 5]),
    device=None,
    is_shared=False)