Optimization approach

Hi,
I am implementing neural style transfer. To optimize the loss, I update the input image, which is a torch.FloatTensor. So far I have been performing the optimization explicitly by multiplying the gradient with the learning rate, since I could not use a built-in optimizer such as Adam: the argument to the optimizer should be an iterable or dict of tensors. Is there any way to use the built-in Adam for this task?
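For context, here is roughly what I am doing now (compute_loss is just a placeholder for my actual style/content loss):

```python
import torch

# The image being optimized; a leaf tensor that requires gradients.
input_img = torch.rand(1, 3, 256, 256, requires_grad=True)
lr = 0.01

for step in range(300):
    loss = compute_loss(input_img)  # placeholder for the style-transfer loss
    loss.backward()
    with torch.no_grad():
        input_img -= lr * input_img.grad  # explicit gradient-descent step
        input_img.grad.zero_()
```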

Hi,

If you have a single parameter, you can pass it to the optimizer in a one-element list containing that parameter.
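For example, something along these lines (compute_loss is again a placeholder for your style-transfer loss):

```python
import torch

input_img = torch.rand(1, 3, 256, 256, requires_grad=True)
# Wrap the single tensor in a list so Adam receives an iterable of parameters.
optimizer = torch.optim.Adam([input_img], lr=0.01)

for step in range(300):
    optimizer.zero_grad()
    loss = compute_loss(input_img)  # placeholder for the style-transfer loss
    loss.backward()
    optimizer.step()
```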