Maybe this is a silly question, but how can we sum over multiple dimensions in pytorch?

In numpy, np.sum() takes an axis argument which can be an int or a tuple of ints, while in pytorch, torch.sum() takes a dim argument which accepts only a single int.
Say I have a tensor of size 16 x 256 x 14 x 14, and I want to sum over the third and fourth dimensions to get a tensor of size 16 x 256. In numpy, one can do np.sum(t, axis=(2, 3)), what is the pytorch equivalent?
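To make the target behavior concrete, here is a small sketch of the numpy call (random values, only the shape matters):

```python
import numpy as np

t = np.random.randn(16, 256, 14, 14)
# axis can be a tuple, so both trailing dimensions are reduced in one call
out = np.sum(t, axis=(2, 3))
print(out.shape)  # (16, 256)
```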

numpy's sum() is like tensorflow's reduce_sum. Unfortunately, right now pytorch's sum doesn't support summing over multiple axes at once (it's being worked on, though!), but there are ways to be creative with this.

One way, depending on how many dimensions your tensor has, is to do it by hand with for loops, after transposing the tensor so that the dimension you want to reduce over comes last. For example:

import torch

x = torch.randn(2, 2, 2)
# Say we want to sum over the 1st dimension (0-indexed).
# Transpose so that dimension becomes the last one, then sum
# each innermost row with nested loops. .item() converts each
# 0-dim tensor to a plain float so the list can be rebuilt into a tensor.
x = x.transpose(1, 2)
torch.tensor([[z.sum().item() for z in y] for y in x])