Hi all.
I want to change the input so that it minimizes the loss (rather than changing the trainable variables). In other words, I want to update the input instead of the weights and biases.
I use this code for iteration:
Variables are deprecated since PyTorch 0.4, so you should use tensors now.
You would have to pass the input tensor to an optimizer so that it can update the input (similar to how you pass model.parameters() to an optimizer).
The same workflow applies as usual, i.e. you would zero out the gradients, perform the forward and backward passes, and call optimizer.step() to update the input.
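A minimal sketch of that workflow (the model, loss, and shapes here are placeholders for illustration; substitute your own):

```python
import torch
import torch.nn as nn

# Placeholder model; its parameters are frozen so only the input changes
model = nn.Linear(10, 1)
for param in model.parameters():
    param.requires_grad = False

# The input is created as a leaf tensor that requires gradients
x = torch.randn(1, 10, requires_grad=True)
target = torch.zeros(1, 1)

# Pass the input tensor to the optimizer instead of model.parameters()
optimizer = torch.optim.SGD([x], lr=0.1)
criterion = nn.MSELoss()

for step in range(100):
    optimizer.zero_grad()        # zero out the gradients
    output = model(x)            # forward pass
    loss = criterion(output, target)
    loss.backward()              # backward pass populates x.grad
    optimizer.step()             # updates x in place
```

After the loop, x has been moved toward an input that minimizes the loss, while the model's weights and bias are untouched.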
Thanks for the reply.
I don't understand how to use a tensor instead of a Variable. How do I define requires_grad=True for the input?
I have another question: when I want the gradient with respect to the input and don't want to change W and b, do I set param.requires_grad to False or True?
```python
for param in model.parameters():
    param.requires_grad = False
```
When I update W and b it is True, but I don't know what it should be here.