Python UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed

9rnv2umw posted on 2023-01-01 in Python
import torch
from torch.autograd import Variable

x = Variable(torch.FloatTensor([11.2]), requires_grad=True)
y = 2 * x

print(x)
print(y)

print(x.data)
print(y.data)

print(x.grad_fn)
print(y.grad_fn)

y.backward() # Calculates the gradients

print(x.grad)
print(y.grad)


Error:

C:\Users\donhu\AppData\Local\Temp\ipykernel_9572\106071707.py:2: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations. (Triggered internally at aten\src\ATen/core/TensorBody.h:485.)
  print(y.grad)

Source code: https://github.com/donhuvy/Deep-learning-with-PyTorch-video/blob/master/1.5.variables.ipynb
How can this be fixed?


n7taea2i1#

Call y.retain_grad() before calling y.backward().
The reason is that, by default, PyTorch only populates .grad for leaf variables (variables that are not the result of an operation), which in this case is x. To make sure .grad is also populated for non-leaf variables such as y, you need to call their .retain_grad() method.
It is also worth noting that this is a warning, not an error.
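A minimal sketch of the fix, using the current torch.tensor API instead of the deprecated Variable wrapper (Variable has been merged into Tensor since PyTorch 0.4):

```python
import torch

x = torch.tensor([11.2], requires_grad=True)  # leaf tensor
y = 2 * x                                     # non-leaf tensor (result of an op)

y.retain_grad()  # ask autograd to also populate y.grad
y.backward()     # works without arguments because y has a single element

print(x.grad)  # tensor([2.])  since dy/dx = 2
print(y.grad)  # tensor([1.])  since dy/dy = 1, and no warning is raised
```

With retain_grad() in place, accessing y.grad after backward() returns the gradient instead of None.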
