PyTorch super weird dimensions

I have a simple LSTM network that I am trying to debug. As a sanity check, I verify that the L2 norm of the difference between two slices is 0. The value is correct, but when I print the shape I don't get a scalar; I get something with crazy dimensions.

The input is of size 4x50x28. np.linalg.norm returns a value of 0, which is correct. But why is the shape (1, 1, 1, …)??

What is wrong with PyTorch?

import numpy as np
import torch
import torch.nn as nn
from torch.autograd import Variable

class Net(nn.Module):
    def __init__(self, feature_dim, hidden_dim, batch_size):
        super(Net, self).__init__()

        # LSTM architecture
        self.hidden_size = hidden_dim
        self.input_size = feature_dim
        self.batch_size = batch_size
        self.num_layers = 1

        # initialize hidden and cell states
        self.hn = Variable(torch.randn(self.num_layers, self.batch_size, self.hidden_size))
        self.cn = Variable(torch.randn(self.num_layers, self.batch_size, self.hidden_size))

        # LSTM
        self.lstm = nn.LSTM(feature_dim, hidden_size=self.hidden_size,
                            num_layers=self.num_layers, batch_first=True)

        # fully connected output layer
        self.fc1 = nn.Linear(hidden_dim, 2)

    def forward(self, x, mode=False):
        h0 = self.hn
        c0 = self.cn

        print(np.shape(x))

        # step through the sequence one timestep at a time
        for (i, xt) in enumerate(torch.t(x)):
            output, (h0, c0) = self.lstm(xt[:, None, :], (h0, c0))
            # sanity check: xt should equal x[:, i, :], so the norm should be 0
            print(np.linalg.norm(xt - x[:, i, :]))

        output = self.fc1(output[:, -1, :])
        return output

The output is:

torch.Size([4, 50, 28])
[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[[Variable containing:
0
[torch.FloatTensor of size 1]
]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]]
(1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1)

Hi,

The problem is that NumPy functions don't work nicely with torch Tensors. np.linalg.norm doesn't know how to reduce a Variable, so instead of computing a scalar it buries the Variable under 32 levels of singleton object-array nesting (NumPy's dimension limit), which is where the (1, 1, 1, …) shape comes from. Use PyTorch's own norm function instead: torch.norm() or (xt - x[:, i, :]).norm().
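
For example, a minimal sketch of the fix (shapes chosen to match your 4x50x28 input; written for recent PyTorch versions, where Variable is merged into Tensor):

import torch

x = torch.randn(4, 50, 28)        # (batch, seq, features), same layout as your input
xt = x[:, 0, :]                   # one timestep slice, shape (4, 28)

diff = xt - x[:, 0, :]            # identical slices, so every entry is 0
print(torch.norm(diff))           # a proper scalar (0-dim tensor): tensor(0.)
print(diff.norm().item())         # .item() extracts a plain Python float: 0.0

If you really want to use np.linalg.norm, convert to a NumPy array first, e.g. np.linalg.norm(diff.detach().numpy()) (or .data.numpy() on old Variable-based versions); NumPy reduces ndarrays correctly, it just can't handle the wrapped torch object directly.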