Can I use torch.nn.L1Loss to compute the L1 norm between two tensors in the middle of my forward pass?

I wonder if I can use torch.nn.L1Loss as an L1 distance function, in other words:

criterion = torch.nn.L1Loss(reduction='none')
similarity = criterion(x, y)
model(similarity)

where both x and y have requires_grad=True. I am not sure whether it will automatically set y to requires_grad=False or not?

Hi Seyeeet!

Yes, this will work fine. L1Loss will propagate gradients back
through both of its arguments without issue.
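You can verify this directly. Here is a minimal sketch (the tensor shapes are just for illustration) showing that after backpropagating through L1Loss with reduction='none', both inputs receive gradients and neither has its requires_grad flag changed:

```python
import torch

# Both inputs participate in autograd.
x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)

criterion = torch.nn.L1Loss(reduction='none')
similarity = criterion(x, y)  # elementwise |x - y|, same shape as x and y

# Reduce to a scalar just so we can call backward().
similarity.sum().backward()

# Both inputs received gradients, and y still requires grad.
print(x.grad is not None, y.grad is not None)  # True True
print(y.requires_grad)                          # True
```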

As an aside, stylistically, I would probably write this as

model((x - y).abs())

To my eye, that’s a little more readable (but it does the same thing).
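For what it's worth, a quick sanity check (my own sketch, not from the thread) confirms the two expressions produce the same values:

```python
import torch

x = torch.randn(4)
y = torch.randn(4)

# L1Loss with reduction='none' vs. the explicit elementwise form.
a = torch.nn.L1Loss(reduction='none')(x, y)
b = (x - y).abs()

print(torch.allclose(a, b))  # True
```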

Best.

K. Frank
