Weird behavior between tensor of size n and tensor of size n x 1

Hi there,

While building one of my networks I stumbled across something I found rather weird. I wanted to check the accuracy of my network by doing

Pred == Labels

I would expect to get back a 1D tensor with the size of my batch. Instead I got a tensor of size batch x batch, and I don’t really understand why. Looking deeper, I found that Pred is a tensor of size (batch) and Labels a tensor of size (batch x 1). I would have thought that a tensor of size n and a tensor of size (n x 1) would be treated the same, but apparently they aren’t. Can anyone tell me why this is?

If you want to test this yourself try this piece of code

import torch

a = torch.randn(16)       # shape (16,)
b = torch.randn((16, 1))  # shape (16, 1)

a.random_(1, 16)  # fill both with random integers in [1, 16) just to get some values the same
b.random_(1, 16)

a == b           # weird? result has shape (16, 16)
a == b.view(16)  # expected output, shape (16,)

The way I had to solve this was by reshaping the Labels with view to drop that extra dimension (see the example code above).
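For reference, here is a minimal sketch of how the accuracy computation behaves with and without the reshape; the names pred and labels, the batch size, and the use of torch.randint are just placeholders, not my actual network code:

import torch

batch = 16
# hypothetical stand-ins for the network output and the loaded labels
pred = torch.randint(0, 10, (batch,))       # shape (batch,)
labels = torch.randint(0, 10, (batch, 1))   # shape (batch, 1)

# without the reshape, the comparison broadcasts to (batch, batch)
# and the accuracy computed from it is silently wrong
(pred == labels).shape                      # torch.Size([16, 16])

# dropping the extra dimension gives the per-sample comparison
correct = (pred == labels.view(-1)).sum().item()
accuracy = correct / batch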

Thanks in advance!

Look at the broadcasting semantics on the docs: http://pytorch.org/docs/master/notes/broadcasting.html

You have two tensors that don’t have the same size, so the broadcasting semantics come into play.

The shapes (16) and (16, 1) are compared starting from the trailing dimension: 16 can be broadcast against 1, so the tensors are broadcastable. What happens (according to the docs) is that dimensions of size 1 are prepended to the tensor with fewer dimensions, so the (16) tensor effectively has size (1, 16).

Now, comparing (1, 16) with (16, 1) gives a result tensor of size (16, 16), which is what you’re getting.
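To make the shapes concrete, something along these lines (reusing a and b from the snippet above) should show the broadcast result, and that it matches explicitly expanding both tensors to (16, 16):

import torch

a = torch.randn(16)      # shape (16,)
b = torch.randn(16, 1)   # shape (16, 1)

print((a == b).shape)            # torch.Size([16, 16])
print((a == b.view(16)).shape)   # torch.Size([16])

# the broadcast comparison matches expanding both tensors by hand:
# a is treated as (1, 16) and repeated down the rows,
# b's size-1 column is repeated across the columns
assert torch.equal(a == b, a.expand(16, 16) == b.expand(16, 16))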

Oh okay! Thank you for your answer! I get it now :slight_smile: