F.mse_loss not symmetric?

Hello,

I recently noticed some strange behavior when using F.mse_loss.

Here’s the test I ran:

import torch 
import torch.nn as nn 
import torch.nn.functional as F 


layer = nn.Linear(1,3)

x = torch.rand(1,1)
label = torch.rand(1,3)
out = layer(x)

print('Input: {}\nLabel: {}\nResult: {}'.format(x, label, out))


loss_1 = F.mse_loss(out, label)
loss_2 = F.mse_loss(label, out)

print('Loss1: {}\nLoss2: {}'.format(loss_1, loss_2))

Output:

Input: tensor([[0.6389]])
Label: tensor([[0.9091, 0.5892, 0.8812]])
Result: tensor([[ 0.2329, -0.2419, -0.5444]], grad_fn=<ThAddmmBackward>)
Loss1: 1.060153603553772
Loss2: 3.1804609298706055
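
Computing the loss by hand (appended to the script above) should give the same value regardless of argument order, since the squared differences are identical either way:

# Manual MSE for comparison; symmetric by construction.
manual_1 = ((out - label) ** 2).mean()
manual_2 = ((label - out) ** 2).mean()
print('Manual1: {}\nManual2: {}'.format(manual_1, manual_2))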

Am I missing something here?

Thanks!

Hi,

I can’t reproduce that; I get the exact same values for both on my machine.

  • Which version of PyTorch are you using?
  • How did you install PyTorch?

I seem to remember issues where mm operations misbehaved when PyTorch was built against a broken or incompatible BLAS library.
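
If you’re not sure, could you also run something like this and paste the output? It prints the installed version and does a tiny mm sanity check against a hand-computed result (just a rough probe, not a definitive BLAS test):

import torch

print(torch.__version__)

# The expected product is computed by hand, so a mismatch here would
# point at a broken BLAS rather than at mse_loss itself.
a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.tensor([[5.0, 6.0], [7.0, 8.0]])
print(torch.mm(a, b))  # expected: [[19., 22.], [43., 50.]]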

I can reproduce it in 0.4.1 installed using conda.

Yep, this is a bug in 0.4.1. Sorry about that. It is already fixed on master.

Indeed, I installed it through conda.

Perfect! Thanks for fixing it!