I am writing my own loss function in which I want to use NumPy's `histogram2d` function, but I am wondering whether these operations will be tracked by autograd?
```python
def __call__(self, pred, ref):
    # build the joint histogram with NumPy on detached copies of the tensors
    hist_2d, _, _ = np.histogram2d(pred.cpu().detach().numpy().ravel(),
                                   ref.cpu().detach().numpy().ravel())
    nmi = self.normalized_mutual_information(
        torch.from_numpy(hist_2d).float().requires_grad_().cuda())
    return 1 - nmi

def normalized_mutual_information(self, hgram):
    ...
```
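The body of `normalized_mutual_information` is not shown here; for context, a minimal sketch, assuming the common definition NMI = (H(X) + H(Y)) / H(X, Y) computed from the joint histogram:

```python
def normalized_mutual_information(self, hgram):
    pxy = hgram / hgram.sum()   # joint probability from the 2D histogram
    px = pxy.sum(dim=1)         # marginal of the first variable
    py = pxy.sum(dim=0)         # marginal of the second variable
    def entropy(p):
        p = p[p > 0]            # drop empty bins to avoid log(0)
        return -(p * p.log()).sum()
    return (entropy(px) + entropy(py)) / entropy(pxy)
```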
Thank you for your quick response!
I do not really understand your first point. Does it mean the method inside the loss class has to be named forward()? I used __call__ so that it gets called automatically, since there is more than one function in the loss class.
Second, if I implement histogram2d myself on the tensors, I do not have to write the backward pass myself, right?
From the code you sent, your loss class is a subclass of nn.Module. As the docs explain, you should only implement forward() and then use the module by calling it on your inputs:

```python
mod = YourMod(init_args)
out = mod(input)  # this calls .forward() plus other things PyTorch needs to do
```
The __call__ method is implemented on the base class and should not be overridden.
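Concretely, for the loss class above, that means renaming __call__ to forward and leaving the body unchanged (a sketch; NMILoss is a hypothetical name, since the class header is not shown in the thread):

```python
import torch.nn as nn

class NMILoss(nn.Module):  # hypothetical name; the class header is not shown above
    def forward(self, pred, ref):
        # Same body as the __call__ above, only the method name changes;
        # nn.Module.__call__ dispatches to forward() and runs hooks around it.
        ...
```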
If you implement this method using only PyTorch operations, then yes, autograd will compute the gradients for you.
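On the second follow-up question: a 2D histogram written with only torch ops will indeed be tracked by autograd, but note that hard binning is piecewise constant in its inputs, so its gradients are zero almost everywhere; differentiable mutual-information losses therefore typically use soft (Parzen-window) binning instead. A minimal sketch, where the bin count, value range, and kernel width are all illustrative assumptions:

```python
import torch

def soft_histogram2d(x, y, bins=32, vmin=0.0, vmax=1.0, sigma=0.02):
    # Differentiable 2D histogram: each sample contributes a Gaussian
    # weight to every bin center instead of a hard 0/1 count.
    centers = torch.linspace(vmin, vmax, bins, device=x.device)
    wx = torch.exp(-0.5 * ((x.reshape(-1, 1) - centers) / sigma) ** 2)  # (N, bins)
    wy = torch.exp(-0.5 * ((y.reshape(-1, 1) - centers) / sigma) ** 2)  # (N, bins)
    joint = wx.t() @ wy            # (bins, bins) soft joint counts
    return joint / joint.sum()     # normalized joint distribution
```

Because every operation here is a torch op, gradients flow from 1 - nmi back to pred through loss.backward() with no hand-written backward pass.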