MSELoss
- class torch.nn.MSELoss(size_average=None, reduce=None, reduction='mean')[source]
Creates a criterion that measures the mean squared error (squared L2 norm) between each element in the input $x$ and target $y$.

The unreduced (i.e. with reduction set to 'none') loss can be described as

$$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \qquad l_n = (x_n - y_n)^2,$$

where $N$ is the batch size. If reduction is not 'none' (the default is 'mean'), then

$$\ell(x, y) = \begin{cases} \operatorname{mean}(L), & \text{if reduction} = \text{'mean'}; \\ \operatorname{sum}(L), & \text{if reduction} = \text{'sum'}. \end{cases}$$

$x$ and $y$ are tensors of arbitrary shapes with a total of $n$ elements each.

The mean operation still operates over all the elements, and divides by $n$.

The division by $n$ can be avoided by setting reduction = 'sum'.
- Parameters
size_average (bool, optional) – Deprecated (see reduction). By default, the losses are averaged over each loss element in the batch. Note that for some losses there are multiple elements per sample. If the field size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True
reduce (bool, optional) – Deprecated (see reduction). By default, the losses are averaged or summed over observations for each minibatch depending on size_average. When reduce is False, returns a loss per batch element instead and ignores size_average. Default: True
reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'. The three reduction modes are illustrated in the sketch after this list.
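A minimal sketch of the three reduction modes; the input values below are arbitrary and chosen only to make the arithmetic easy to follow by hand:

>>> import torch
>>> from torch import nn
>>> x = torch.tensor([1.0, 2.0, 3.0])
>>> y = torch.tensor([1.5, 2.0, 0.0])
>>> nn.MSELoss(reduction='none')(x, y)   # per-element (x_n - y_n)^2
tensor([0.2500, 0.0000, 9.0000])
>>> nn.MSELoss(reduction='mean')(x, y)   # sum of the per-element losses divided by n = 3
tensor(3.0833)
>>> nn.MSELoss(reduction='sum')(x, y)    # same sum, but without the division by n
tensor(9.2500)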
- Shape
Input: $(*)$, where $*$ means any number of dimensions.
Target: $(*)$, same shape as the input.
Examples
>>> loss = nn.MSELoss()
>>> input = torch.randn(3, 5, requires_grad=True)
>>> target = torch.randn(3, 5)
>>> output = loss(input, target)
>>> output.backward()
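Continuing the example above, the gradient accumulated in input.grad can be checked against the closed form $2(x - y)/n$ that follows from the 'mean' reduction; this check is an illustrative sketch, not part of the reference:

>>> expected = 2 * (input.detach() - target) / input.numel()  # n = 15 elements here
>>> torch.allclose(input.grad, expected)
True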