Update parameter with differentiable gradient

I’m working on a meta-learning project and I want to update a parameter theta of a module with a gradient d_loss(theta, x)/d_theta that stays differentiable with respect to x.

I can get the differentiable gradient using autograd.grad, but I see no way to update my parameter by hand without detaching that gradient from the graph.

For example:

import torch
from torch import autograd, nn

module = nn.Linear(4, 2)
input, x, alpha = torch.randn(3, 4), torch.randn(2, requires_grad=True), 0.1
loss = (module(input) * x).sum()
# create_graph=True keeps the gradient differentiable wrt x
grad = autograd.grad(loss, module.weight, create_graph=True)[0]
# this does not work: weight is an nn.Parameter, so assigning a plain Tensor raises a TypeError
module.weight = module.weight - alpha * grad

Is there any simple way to do that?

Another solution to my problem would be the equivalent of an nn.Module but with plain tensors instead of parameters (I just don’t want to re-implement the conv2d mechanism by hand with tensors).
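Roughly, the interface I have in mind looks like this (a hypothetical sketch for the linear case only; TensorLinear is a name I made up and the shapes are arbitrary):

import torch

class TensorLinear:
    # like nn.Linear, but weight and bias are plain tensors, so they can be
    # overwritten with a graph-connected update
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias
    def __call__(self, inp):
        return inp @ self.weight.t() + self.bias

layer = TensorLinear(torch.randn(2, 4, requires_grad=True), torch.zeros(2))
out = layer(torch.randn(3, 4))   # shape (3, 2)
# after computing a differentiable grad:  layer.weight = layer.weight - alpha * grad

The linear case is trivial to write by hand; it is conv2d that I don’t want to reimplement.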

Solved: nn.functional is exactly what I need. Sorry for the unnecessary topic.
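
For anyone who lands here later, a minimal sketch of the kind of functional update that works (the linear example from above; alpha, the shapes and the outer loss are just placeholders):

import torch
import torch.nn.functional as F
from torch import autograd, nn

module = nn.Linear(4, 2)
weight, bias = module.weight, module.bias
input = torch.randn(3, 4)
x = torch.randn(2, requires_grad=True)
alpha = 0.1

# inner loss; create_graph=True keeps the gradient differentiable wrt x
loss = (F.linear(input, weight, bias) * x).sum()
grad = autograd.grad(loss, weight, create_graph=True)[0]

# manual update as a plain tensor: no Parameter assignment, graph preserved
new_weight = weight - alpha * grad

# reuse the updated weight through the functional API and differentiate wrt x
outer_loss = (F.linear(input, new_weight, bias) ** 2).sum()
grad_x = autograd.grad(outer_loss, x)[0]

Since new_weight is an ordinary tensor, the update stays in the graph and the outer gradient flows back to x.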