Taking the gradient of each loss term with respect to the weights

I am wondering how I can take the gradient of the residual and boundary-condition (BC) losses w.r.t. the weights. Here is my setup:
import torch
import torch.nn as nn
import torch.optim as optim

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.layers = nn.Sequential(
            nn.Linear(input_n, h_n),
            Swish(),
            nn.Linear(h_n, h_n),
            Swish(),
            nn.Linear(h_n, h_n),
        )

    def forward(self, x):
        output = self.layers(x)
        return output

model = Model()
optimizer = optim.Adam(model.parameters(), lr=learning_rate, betas=(0.9, 0.99), eps=10**-15)

loss = loss_res + Lambda_BC * loss_bc
loss.backward()
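
As far as I understand, after loss.backward() each parameter's .grad only holds the gradient of the combined loss, not the separate contributions of loss_res and loss_bc:

for name, p in model.named_parameters():
    print(name, p.grad.shape)  # gradient of loss_res + Lambda_BC * loss_bc, not of each term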

I extracted the weights of the model with this approach:
for layers in model.modules():
    if isinstance(layers, nn.Linear):
        weight = layers.weight
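
Since this overwrites weight on every iteration, I am assuming I would actually collect them in a list, roughly like this:

weights = [m.weight for m in model.modules() if isinstance(m, nn.Linear)]  # one weight tensor per Linear layer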

Do I need to do something like this:
weight.requires_grad = True
loss_res_grad = torch.autograd.grad(loss_res, weight)[0]
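
Or, if the parameters already have requires_grad=True, could I just do something along these lines instead? This is only a rough sketch of what I have in mind:

params = [p for p in model.parameters() if p.requires_grad]
# retain_graph=True so the same graph can be reused for the second call
loss_res_grads = torch.autograd.grad(loss_res, params, retain_graph=True)  # tuple: one gradient per parameter
loss_bc_grads = torch.autograd.grad(loss_bc, params, retain_graph=True)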

Thanks,
Mari