PyTorch Forums
Is it correct way to do cross channel normalization?
ptrblck
March 16, 2019, 12:49pm
You could probably wrap it in a `with torch.no_grad(): ...` block instead, as described here.
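A minimal sketch of what that could look like, assuming the goal is to renormalize a layer's weights across channels without the operation being recorded by autograd (the layer, dims, and `eps` value here are illustrative, not from the thread):

```python
import torch
import torch.nn as nn

# Hypothetical example: a small conv layer whose weights we want to
# L2-normalize per output channel, outside of the autograd graph.
conv = nn.Conv2d(3, 8, kernel_size=3)

with torch.no_grad():
    # L2 norm over the input-channel and kernel dims, one value
    # per output channel; clamp avoids division by zero.
    norm = conv.weight.norm(p=2, dim=(1, 2, 3), keepdim=True).clamp_min(1e-8)
    conv.weight.div_(norm)  # in-place update, no grad history recorded

# The weights remain leaf tensors with requires_grad=True,
# so subsequent training steps work as usual.
print(conv.weight.requires_grad)  # True
```

Because the in-place `div_` happens inside `torch.no_grad()`, it neither errors on a leaf tensor that requires grad nor leaves a node in the graph for the next backward pass.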