The requires_grad of tensors b and c is True, but the requires_grad of tensor d is False. I am curious why this happens, since requires_grad is True for all of the inputs.
However, the requires_grad of tensor e is True, and I can still call backward() on e. Is there anything wrong with doing it this way?
I am using Python 3.7 and PyTorch 1.1.
import torch
import torch.nn as nn

net = nn.Conv2d(1, 1, 3, padding=1)
a = torch.randn(1, 1, 10, 10)
b = net(a)
c = net(b)
d = torch.gt(b, c)   # element-wise comparison b > c
e = b - c
e[e > 0] = 1.0       # in-place masked assignment
e[e < 0] = 0.0
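For reference, this is roughly how I inspected the flags and ran the backward pass on the code above (a minimal sketch; the print statements and the sum() reduction before backward() are just for illustration, since backward() on a non-scalar would otherwise need an explicit gradient argument):

print(b.requires_grad)  # True
print(c.requires_grad)  # True
print(d.requires_grad)  # False  (the behaviour I am asking about)
print(e.requires_grad)  # True

e.sum().backward()            # runs without raising an error
print(net.weight.grad.shape)  # gradients do reach the conv weights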