
I'm new to PyTorch. I learned that it uses autograd to automatically calculate the gradients for gradient descent.

Instead of adjusting the weights, I would like to mutate the input to achieve a desired output, using gradient descent. That is, I want to keep all of the weights fixed and change only the input to minimize the loss.

For example, say the network is a trained classifier for the digits 0-9. I input random noise, and I want to use gradient descent to adjust the values of that input (originally noise) until the network considers it a 3 with 60% confidence.

Is there a way to do this?

  • That is not possible with just one network. You will need something like a GAN (generative adversarial network), which is a combination of two networks, to make this work. Commented Sep 12, 2020 at 20:37
  • Yes, you can. Take a look at this official tutorial. Commented Sep 13, 2020 at 1:18

1 Answer


I assume you know how to do regular training with gradient descent. You only need to change which tensors the optimizer updates. Something like:

# ... set up your network, load the input ...

# Set requires_grad properly -> we train the input, not the parameters
input.requires_grad = True
for p in net.parameters():
    p.requires_grad = False

# Set up the optimizer.
# Previously we would have had SomeOptimizer(net.parameters());
# now we hand it the input instead.
optim = SomeOptimizer([input])

output_that_you_want = ...
actual_output = net(input)
some_loss = SomeLossFunction(output_that_you_want, actual_output)

# ... back-prop and optim.step() as usual
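To make that concrete for the digit example in the question, here is a minimal end-to-end sketch. The tiny classifier built at the top is only a hypothetical stand-in for your trained 0-9 network, and the learning rate and step count are arbitrary; the loop drives random noise toward whatever the network considers a 3, stopping once the softmax confidence reaches 60%.

import torch
import torch.nn as nn

# Hypothetical stand-in for a trained 0-9 classifier; any module that
# maps a (1, 1, 28, 28) image to 10 logits works the same way.
net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
net.eval()
for p in net.parameters():
    p.requires_grad = False

# Random noise is the leaf tensor we optimize
x = torch.randn(1, 1, 28, 28, requires_grad=True)
optim = torch.optim.Adam([x], lr=0.05)

target = torch.tensor([3])          # we want a "3" ...
wanted_confidence = 0.60            # ... with at least 60% confidence

for step in range(500):
    optim.zero_grad()
    logits = net(x)
    confidence = torch.softmax(logits, dim=1)[0, target.item()]
    if confidence.item() >= wanted_confidence:
        break
    loss = nn.functional.cross_entropy(logits, target)
    loss.backward()                 # gradient flows into x, not the weights
    optim.step()

Cross-entropy toward the target class is just one convenient choice of loss here; the loop stops as soon as the confidence crosses the threshold, which matches the "60%" goal without encoding it in the loss itself.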

1 Comment

What if we are using a DataLoader? That seems a bit more complicated, because I don't know whether we have to add the inputs to the optimizer inside the training loop or before it.
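One way to handle that, as a sketch: treat each batch as a fresh optimization problem, so the leaf tensor and its optimizer are created inside the loop. The names loader, loss_fn, and desired_output below are assumed placeholders, not anything from the answer above.

# Assumes `net` is already frozen as in the answer; `loader`,
# `loss_fn`, and `desired_output` are hypothetical placeholders.
for images, _ in loader:
    x = images.clone().detach().requires_grad_(True)  # fresh leaf per batch
    optim = torch.optim.Adam([x], lr=0.05)            # rebuilt so it tracks x
    for step in range(100):
        optim.zero_grad()
        loss = loss_fn(net(x), desired_output)
        loss.backward()
        optim.step()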
