Is this expected behavior?

So the following bit of code:

    try:
        # NOTE: category_tensor is a plain Python int here (it prints as 1 below)
        loss = criterion(output, Variable(torch.LongTensor(category_tensor).view(1, -1), requires_grad=False))
    except:
        print('Category Tensor: {}'.format(category_tensor))
        print(torch.LongTensor(category_tensor).view(1, -1), output)

gives me the following printed output:

    Category Tensor: 1

     5.7646e+18
    [torch.LongTensor of size 1x1]
     Variable containing:

    Columns 0 to 9
    -0.0730 -5.6726 -5.3783 -5.5711 -5.3906 -5.5028 -5.5235 -5.4769 -5.3868 -5.6045

    Columns 10 to 17
    -5.5530 -5.5870 -5.5429 -5.4055 -5.2826 -5.4653 -5.5103 -5.5018
    [torch.FloatTensor of size 1x18]

Is that expected? I'm not sure what's happening here.

I guess to answer my own question: if I want to create a 1x1 LongTensor, I have to define its contents explicitly. Passing an int to `torch.LongTensor` is interpreted as a size, so this is probably just allocating uninitialized memory, à la `np.empty`?
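
A minimal sketch of the two constructor behaviors (assuming, as the `Category Tensor: 1` line suggests, that `category_tensor` is a plain Python int):

    import torch

    category_tensor = 1  # a plain Python int, as in the snippet above

    # An int argument to the legacy constructor is read as a *size*, so this
    # allocates a 1-element LongTensor without initializing its memory --
    # analogous to np.empty, hence garbage values like 5.7646e+18:
    uninitialized = torch.LongTensor(category_tensor)

    # A (nested) list argument is read as *data*, so the value is copied in:
    target = torch.LongTensor([[category_tensor]])

    print(uninitialized)  # arbitrary memory contents
    print(target)         # a 1x1 tensor containing 1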

If you want to create a 1x1 tensor that contains a 1, you should do `torch.LongTensor([[1]])`.
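
Applied to the snippet above, the loss line could then look like this. This is only a sketch: it assumes `criterion` is `nn.NLLLoss` (which would match the 1x18 log-probability output), and note that `NLLLoss` expects a 1-D target of class indices rather than a 1x1 tensor:

    import torch
    import torch.nn as nn

    criterion = nn.NLLLoss()     # assumption: matches the log-softmax-style output above
    output = torch.randn(1, 18)  # stand-in for the network's 1x18 output
    category_tensor = 1          # the target class index from the question

    # NLLLoss wants a LongTensor of class indices, one per batch row;
    # in PyTorch >= 0.4 the Variable wrapper is no longer needed:
    target = torch.LongTensor([category_tensor])
    loss = criterion(output, target)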