Loss.backward() failure due to contiguous issue

Or just generate a random input with the same shape as the one the previous layer receives.
Then run the forward pass through the previous layer, l10, and the layer after it, to be safe.
Then call .sum().backward() on the output.
That will run a backward pass through just these three layers and show whether the error reproduces in isolation.
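Something along these lines should do it; the layer definitions and shapes below are only placeholders (I don't know your actual architecture), so substitute the real modules around l10 and the real input shape:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the three layers; replace them with the actual
# modules from your model (the layer before l10, l10 itself, and the one after).
prev_layer = nn.Linear(128, 64)
l10 = nn.Linear(64, 64)
next_layer = nn.Linear(64, 32)

# Random input with the same shape the previous layer normally receives
# (batch size and feature size here are just placeholders).
x = torch.randn(8, 128, requires_grad=True)

# Forward through only these three layers.
out = next_layer(l10(prev_layer(x)))

# Reduce to a scalar and backprop; if the contiguity error comes from one of
# these layers, it should show up here in isolation.
out.sum().backward()
print("backward succeeded, grad shape:", x.grad.shape)
```

If this snippet runs cleanly, the problem is likely upstream of these layers (e.g. a view/slice of a tensor that feeds into them), and you can widen the window of layers until the error appears.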