
I have this code :

    from torch.autograd import Variable
    d_real_data = Variable(d_sampler(d_input_size))

But I wonder: what is the difference between Variable(d_sampler(d_input_size)) and d_sampler(d_input_size)?

I think both are tensors, but their values are different. So I was wondering: what is the purpose of this Variable function?

1 Answer


Variable() was a way to use autograd with tensors. It is now deprecated and should not be used anymore. Tensors work with autograd directly as long as their requires_grad flag is set to True. From the official docs:

The Variable API has been deprecated: Variables are no longer necessary to use autograd with tensors. Autograd automatically supports Tensors with requires_grad set to True.
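As a minimal sketch of what this means in practice (the tensor shape and the computation here are arbitrary assumptions, not from your code):

    import torch

    # Old, deprecated style: wrap the tensor in Variable to enable autograd
    # from torch.autograd import Variable
    # x = Variable(torch.randn(3), requires_grad=True)

    # Current style: set requires_grad=True on the tensor itself
    x = torch.randn(3, requires_grad=True)

    y = (x ** 2).sum()  # any differentiable computation
    y.backward()        # autograd works without any Variable wrapper

    print(x.grad)       # gradient of y with respect to x, i.e. 2 * x

So in your snippet, Variable(d_sampler(d_input_size)) and d_sampler(d_input_size) hold the same kind of data; the wrapper only mattered for autograd in older PyTorch versions. The values differ simply because d_sampler draws new random samples on each call.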
