Updating weight parameters in custom nn layer

I am new to PyTorch and have written a custom nn layer. It has two weight parameters, which I declared in the __init__ function as follows.

self.weight_forward = nn.Parameter(torch.Tensor(self.length, self.config.emsize))
self.weight_backward = nn.Parameter(torch.Tensor(self.length, self.config.emsize))

Everything is working fine. I just want to know whether these weight parameters get updated along with the other network parameters when I call loss.backward().

Please note that this custom layer is part of my full model and is working as expected. I just want to make sure that when PyTorch runs backpropagation, it includes these weight parameters in the computational graph.

Your parameters will be part of the computational graph if you used them to compute the loss variable you start backpropagation from. There are some special cases in which backpropagation is not performed for a sub-graph, for example when a tensor has requires_grad=False or the computation runs under torch.no_grad(). You can read more in the autograd docs.
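As a concrete check, here is a minimal sketch. The toy layer and its forward computation are invented for illustration; only the two parameter names come from your question. After backward(), both parameters have a populated .grad precisely because both were used to compute the loss.

import torch
import torch.nn as nn

class ToyLayer(nn.Module):
    def __init__(self, length=4, emsize=3):
        super().__init__()
        # Same two parameters as in the question (randomly initialised here)
        self.weight_forward = nn.Parameter(torch.randn(length, emsize))
        self.weight_backward = nn.Parameter(torch.randn(length, emsize))

    def forward(self, x):
        # Both parameters take part in the output, so both end up in the
        # computational graph of any loss computed from it
        return x @ self.weight_forward.t() + x @ self.weight_backward.t()

layer = ToyLayer()
loss = layer(torch.randn(2, 3)).sum()
loss.backward()

print(layer.weight_forward.grad is not None)   # True
print(layer.weight_backward.grad is not None)  # True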

But parameters do not get updated during backpropagation itself. Calling backward() only changes their gradients (the newly computed gradients are accumulated into the existing .grad values). The parameters are usually updated afterwards by an optimizer, based on the gradients computed during backpropagation.
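A short sketch of that division of labour, using a plain nn.Linear and SGD as stand-ins for your actual model and optimizer: backward() only fills the .grad fields, and it is optimizer.step() that changes the parameter values.

import torch
import torch.nn as nn

layer = nn.Linear(3, 4)
optimizer = torch.optim.SGD(layer.parameters(), lr=0.1)

before = layer.weight.detach().clone()

loss = layer(torch.randn(2, 3)).sum()
loss.backward()                           # fills / accumulates .grad only
print(torch.equal(layer.weight, before))  # True: backward() did not change the weights

optimizer.step()                          # updates the parameters from their .grad
print(torch.equal(layer.weight, before))  # False: step() changed them

optimizer.zero_grad()                     # clear accumulated gradients before the next backward()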