Custom Optimizer in PyTorch

For a project I have started building in PyTorch, I need to implement my own descent algorithm (a custom optimizer different from RMSProp, Adam, etc.). In TensorFlow it seems to be possible (https://towardsdatascience.com/custom-optimizer-in-tensorflow-d5b41f75644a), and I would like to know whether it is also the case in PyTorch.

I have tried to do it by simply adding my descent vector to the leaf variable, but PyTorch didn’t agree: “a leaf Variable that requires grad has been used in an in-place operation.”. When I don’t do the operation in-place, the “new” variable loses its leaf status, so that doesn’t work either…

Is there an easy way to create such a custom optimizer in PyTorch?

Thanks in advance :slight_smile:


You may want to look at this post: Regarding implementation of optimization algorithm :slight_smile:

Thanks for pointing it out, I will try to do what they suggest :slight_smile:

Hi Artix!

You can definitely write your own update function!
To update your weights, you can use the built-in optimizers from torch.optim, but you can also do it yourself. For example, you can code plain gradient descent, SGD, or Adam along the lines of the following snippet.

net = NN()  # your network module
learning_rate = 0.01
for param in net.parameters():
    # for plain gradient descent the update is just the gradient;
    # replace this with whatever your algorithm prescribes
    weight_update = param.grad
    param.data.sub_(weight_update * learning_rate)

As you can see, you have access to your parameters through net.parameters(), so you can update them however you want.
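A note on the error from the original question: the in-place update on a leaf tensor only fails while autograd is tracking operations, so a common alternative to going through .data is to do the same update inside torch.no_grad(). A minimal sketch, assuming the gradients have already been computed with loss.backward():

with torch.no_grad():  # autograd does not track this block, so in-place updates on leaf tensors are allowed
    for param in net.parameters():
        if param.grad is not None:
            param -= learning_rate * param.grad  # plain gradient-descent step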

If you want more specific examples, you can go here, where I implemented both SVRG and SAGA: GitHub - kilianFatras/variance_reduced_neural_networks: Implementation of SVRG and SAGA optimization algorithms for deep learning topics. (variance-reduced algorithms)! If you have any further questions, do not hesitate :slight_smile:
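If you would rather have your update rule plug into an existing training loop the same way the built-in optimizers do, you can also subclass torch.optim.Optimizer and put your update in step(). Below is a minimal sketch of plain gradient descent written that way; the class name PlainSGD is just an illustrative example, not an existing PyTorch class:

import torch
from torch.optim import Optimizer

class PlainSGD(Optimizer):
    # Minimal custom optimizer: plain gradient descent, p <- p - lr * grad
    def __init__(self, params, lr=0.01):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()  # updates must not be tracked by autograd
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is not None:
                    # in-place descent step using this group's learning rate
                    p.add_(p.grad, alpha=-group['lr'])
        return loss

You would then use it like any other optimizer: optimizer = PlainSGD(net.parameters(), lr=0.01), then call loss.backward(), optimizer.step(), and optimizer.zero_grad() in the training loop.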


Thank you so much, that’s exactly what I needed! :grinning:


Is this code also useful for image classification?