NaN after Variable.ger()

The following code results in NaN, but as far as I can tell it shouldn't:

import torch
import torch.autograd

def batch_outer_product(x: torch.autograd.Variable, y: torch.autograd.Variable):
    # For each pair of rows, compute the outer product with ger
    # and flatten it into a single row of the result.
    result = []
    for xv, yv in zip(x, y):
        result.append(xv.ger(yv).view(1, -1))
    result = torch.cat(result, 0)
    return result

x = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)
y = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)

xy = batch_outer_product(x, y)
loss = xy.sum()
print(loss)  # prints nan instead of the expected 50 * 64 * 64 = 204800
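For reference, here is a version-independent way to confirm that the NaNs appear in the forward output. It relies on NaN being the only value that compares unequal to itself (newer PyTorch versions also provide torch.isnan); the name num_nan is just illustrative:

# NaN is the only value that is not equal to itself, so this counts
# NaN entries in the underlying tensor without needing torch.isnan.
num_nan = (xy.data != xy.data).sum()
print(num_nan)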

Thanks for the report. It's a bug, and I've fixed it in this PR, which I just merged into master: https://github.com/pytorch/pytorch/pull/1236

The fix will be in the next binary release, due next Wednesday, or you can install the master branch via the instructions here: https://github.com/pytorch/pytorch#from-source

In the meantime, if you want to keep using v0.1.11, you can use this workaround:

import torch
import torch.autograd

# Print full tensors so any NaN entries are visible on inspection.
torch.set_printoptions(profile='full')

def batch_outer_product(x: torch.autograd.Variable, y: torch.autograd.Variable):
    result = []
    for xv, yv in zip(x, y):
        # Allocate a zero-initialized buffer of shape (len(xv), len(yv)) and
        # compute the outer product with addr (out + xv ⊗ yv) instead of ger,
        # which avoids the buggy Variable.ger code path.
        out = torch.autograd.Variable(xv.data.new(xv.size(0), yv.size(0)).zero_())
        result.append(torch.addr(out, xv, yv).view(1, -1))
    result = torch.cat(result, 0)
    return result

x = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)
y = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)

xy = batch_outer_product(x, y)
loss = xy.sum()
print(loss)
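As an aside, if the goal is just a batched outer product, a loop-free formulation with bmm sidesteps ger entirely. This is a sketch that should work on recent versions; I haven't verified it against v0.1.11:

import torch
import torch.autograd

def batch_outer_product_bmm(x, y):
    # Treat each row of x as a (D1, 1) column and each row of y as a
    # (1, D2) row; bmm then produces the (N, D1, D2) stack of outer
    # products, which we flatten to (N, D1 * D2).
    return x.unsqueeze(2).bmm(y.unsqueeze(1)).view(x.size(0), -1)

x = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)
y = torch.autograd.Variable(torch.ones(50, 64), requires_grad=True)
print(batch_outer_product_bmm(x, y).sum())  # expect 50 * 64 * 64 = 204800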

Thank you for the quick response, and for the workaround until the next release!