I am learning PyTorch and wrote the simple code below.

import torch

x = torch.randn(3, requires_grad=True).cuda()
print(x)
y = x * x
print(y)
y.backward(torch.tensor([1.0, 1.0, 1.0]).cuda())
print(x.grad)
tensor([ 0.5934, -1.8813, -0.7817], device='cuda:0', grad_fn=<CopyBackwards>)
tensor([0.3521, 3.5392, 0.6111], device='cuda:0', grad_fn=<MulBackward0>)
None

If I change the code to:

import torch
from torch.autograd import Variable

# x = torch.randn(3, requires_grad=True).cuda()
x = Variable(torch.randn(3).cuda(), requires_grad=True)
print(x)
y = x * x
print(y)
y.backward(torch.tensor([1.0, 1.0, 1.0]).cuda())
print(x.grad)
tensor([0.9800, 0.3597, 1.6315], device='cuda:0', requires_grad=True)
tensor([0.9605, 0.1294, 2.6617], device='cuda:0', grad_fn=<MulBackward0>)
tensor([1.9601, 0.7194, 3.2630], device='cuda:0')

This time the grad is fine. But why? I'd rather not use the Variable class.

env

python: 3.8

pytorch: 1.5

cuda: 10.2

1 Answer

I got it.

x = torch.randn(3, requires_grad=True).cuda()

Here x is created by cuda(), i.e. it is the result of a copy operation (note the grad_fn=<CopyBackwards> in the first print). Autograd only populates .grad for leaf tensors, and this x is not a leaf, so x.grad stays None.
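As a quick check (a minimal sketch; the names a and b are just for illustration), you can compare is_leaf for the two construction styles:

import torch

# Created directly on the GPU: a leaf tensor, autograd will fill .grad.
a = torch.randn(3, requires_grad=True, device='cuda')
print(a.is_leaf)   # True

# Created on the CPU and then copied: the copy is an autograd op,
# so b is a non-leaf tensor and b.grad stays None after backward().
b = torch.randn(3, requires_grad=True).cuda()
print(b.is_leaf)   # False
print(b.grad_fn)   # <CopyBackwards object at ...>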

Changing the code as below fixes it:

x = torch.randn(3, requires_grad=True, device=0)
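For completeness, a minimal sketch of the fixed snippet (device='cuda' is equivalent to device=0 on a single-GPU machine; the printed numbers will differ because randn is random):

import torch

# Creating the tensor directly on the GPU keeps it a leaf tensor.
x = torch.randn(3, requires_grad=True, device='cuda')
print(x.is_leaf)   # True

y = x * x
y.backward(torch.tensor([1.0, 1.0, 1.0], device='cuda'))
print(x.grad)      # gradient of x*x is 2*x, so this prints 2 * x

Alternatively, if you do need to create the tensor on the CPU and move it afterwards, you can call x.retain_grad() on the moved (non-leaf) tensor before backward() so that its .grad is kept.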