
The requires_grad attribute of tensors b and c is True, but the requires_grad of tensor d is False. I am curious why this happens, since requires_grad is True for all of the inputs.

However, the requires_grad of tensor e is True, and I can still call backward() on e. Is there anything wrong with doing it this way?

I am using Python 3.7 and PyTorch 1.1.

import torch
import torch.nn as nn

net = nn.Conv2d(1, 1, 3, padding=1)
a = torch.randn(1, 1, 10, 10)
b = net(a)
c = net(b)
d = torch.gt(b, c)
e = b - c
e[e > 0] = 1.0
e[e < 0] = 0.0
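For reference, these are the flags I observe (a minimal check, continuing from the code above):

print(b.requires_grad)  # True, output of a conv layer with learnable weights
print(c.requires_grad)  # True
print(d.requires_grad)  # False, result of torch.gt
print(e.requires_grad)  # True, result of subtraction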

1 Answer


I assume this is because you cannot take the gradient of the greater-than operation. Its return type is boolean:

>>> torch.gt(torch.tensor([[1, 2], [3, 4]]), torch.tensor([[1, 1], [4, 4]]))
tensor([[False,  True],
        [False, False]])

Subtraction and other arithmetic operations, on the other hand, return another numeric tensor, which stays in the autograd graph.
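Here is a minimal sketch of that contrast (the tensor names are illustrative, not from the question):

import torch

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)

d = torch.gt(x, y)      # comparison: boolean result, not differentiable
print(d.requires_grad)  # False
print(d.grad_fn)        # None, the autograd graph is cut here

e = x - y               # arithmetic: numeric result, tracked by autograd
print(e.requires_grad)  # True
e.sum().backward()      # backward works and gradients reach the inputs
print(x.grad)           # tensor([1., 1., 1.])

Since d has no grad_fn, anything computed from d alone is detached from the graph, which is why its requires_grad is False even though both inputs require gradients.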
