Gradient clipping for one of two losses when using AMP

Hi!
I use two losses during training, and I want to clip the gradient produced by only one of the two.
When using mixed precision, is the following script the best way to handle this?

import torch
import torch.nn as nn
torch.manual_seed(0)
tens = torch.randn(32, 100, device='cuda')
gt = torch.randn(32, 100, device='cuda')
model = nn.Linear(100, 100).cuda()
l1 = nn.L1Loss().cuda()
l2 = nn.MSELoss().cuda()
scaler = torch.cuda.amp.GradScaler()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

model.zero_grad()
with torch.cuda.amp.autocast():
  out = model(tens)
  loss1 = l1(out, gt)
  loss2 = l2(out, gt)
# backward for the loss whose gradient should be clipped; keep the graph for the second backward
scaler.scale(loss1).backward(retain_graph=True)
# unscale so the clipping threshold applies to the true gradient magnitude
scaler.unscale_(opt)
torch.nn.utils.clip_grad_norm_(model.parameters(), 0.1)
# re-apply the scale to the clipped weight gradient so it can accumulate with the still-scaled gradient from loss2
model.weight.grad = scaler.scale(model.weight.grad)
scaler.update()
# backward for the second loss, whose gradient is accumulated without clipping
scaler.scale(loss2).backward()
scaler.unscale_(opt)
scaler.step(opt)
scaler.update()

I am unsure about this script, as I have to manually re-scale the gradient and perform an update on the scaler before calling .step().
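For reference, the single-loss recipe with clipping that I am trying to adapt is the usual scale → backward → unscale_ → clip → step → update ordering (a minimal sketch reusing the variables defined above, with only one loss):

model.zero_grad()
with torch.cuda.amp.autocast():
  out = model(tens)
  loss = l1(out, gt)
scaler.scale(loss).backward()
scaler.unscale_(opt)  # gradients are now in their true (unscaled) range
torch.nn.utils.clip_grad_norm_(model.parameters(), 0.1)  # clip the unscaled gradients
scaler.step(opt)  # skips its internal unscale_ since unscale_ was already called
scaler.update()

My script above tries to insert the clipping (and a manual re-scaling) between the two backward passes, which is the part I am not sure about.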

Thanks for your help!