Mutual information as custom loss function

Hi all, I would like to use mutual information as a custom loss function. Are there any built-in functions I can use to achieve this? Thanks!


There isn’t a built-in loss, but you can simply write a loss function that takes tensors x and y and computes the mutual information. The backward pass is computed automatically by autograd.

def mutual_info_loss(x, y):
    # compute the mutual information using differentiable torch.*
    # operations (e.g. torch.log) so autograd can track the graph
    loss = ...  # your MI estimate here
    return loss

# later...
loss = mutual_info_loss(x, y)
loss.backward()
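
For example, if x and y are batches of softmax outputs of shape [batch, num_classes], you can estimate the joint distribution from the batch and compute MI from it. Here is a minimal sketch along those lines (this is just one possible estimator, not a built-in PyTorch function; the eps clamp is only for numerical stability):

import torch

def mutual_info_loss(x, y, eps=1e-8):
    # estimate the joint p(x, y) by averaging the outer product
    # of the per-sample distributions over the batch
    p_xy = (x.unsqueeze(2) * y.unsqueeze(1)).mean(dim=0)  # [C, C]
    p_xy = p_xy.clamp(min=eps)
    p_xy = p_xy / p_xy.sum()  # renormalize after clamping

    # marginals p(x) and p(y)
    p_x = p_xy.sum(dim=1, keepdim=True)  # [C, 1]
    p_y = p_xy.sum(dim=0, keepdim=True)  # [1, C]

    # I(X; Y) = sum_{x,y} p(x, y) * log( p(x, y) / (p(x) p(y)) )
    mi = (p_xy * (torch.log(p_xy) - torch.log(p_x) - torch.log(p_y))).sum()

    # return negative MI so that minimizing the loss maximizes MI
    return -mi

# usage: two heads producing distributions over the same batch
logits_a = torch.randn(32, 10, requires_grad=True)
logits_b = torch.randn(32, 10, requires_grad=True)
loss = mutual_info_loss(logits_a.softmax(dim=1), logits_b.softmax(dim=1))
loss.backward()

Estimating the joint from the batch like this is how MI-based clustering objectives typically do it, but how you model p(x, y) depends on your problem, so treat the sketch as a starting point.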

Hi @Morpheus_Hsieh,

I need to use mutual information as a loss function as well, but I’m having difficulty. Were you able to implement it?

Hi, sorry I can’t help you there. It’s been a long time and I can’t find the code. :sweat:

Hi smth,

I am facing an issue implementing mutual_info_loss using torch functions. Can you please help me implement it? Any leads will be appreciated.
Thank you