Hello all,

I was wondering whether there is a layer that can perform upsampling in one dimension. For example, Keras has an UpSampling1D layer, but all of PyTorch's upsampling layers seem to require at least 2-dimensional data.

Any ideas?

In the absence of 1D-specific methods, you could always use the 2D ones and unsqueeze + squeeze:

```
import torch
import torch.nn.functional as F
from torch.autograd import Variable  # only needed on PyTorch < 0.4

batch_size, num_chan = 4, 3
a = torch.randn(batch_size, num_chan, 10)
a = a.unsqueeze(dim=3)  # add a dummy width dimension: (N, C, 10, 1)
# 2D bilinear upsampling over (length, dummy), then drop the dummy dim
a_up = F.upsample(Variable(a), size=(20, 1), mode='bilinear').squeeze(dim=3)
```

I have attempted to use unsqueeze and squeeze to pretend the tensor is actually two-dimensional. The bilinear filter works with the specified size, but the nearest-neighbor one does not (I used nn.upsample2d instead of F.upsample, with size set to (x, 1), and it reports that the aspect ratio isn't respected in the nearest-neighbor case). I was hoping to copy the values to double the size, not interpolate between adjacent values.

I will try using F.upsample instead of the normal 2D upsampling module and see if it makes a difference.
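For what it's worth, on more recent PyTorch versions there is a direct 1D route for exactly this "copy each value" behaviour: F.interpolate in mode='nearest' accepts 3D (N, C, L) tensors and a scale_factor, which sidesteps the aspect-ratio check, and repeat_interleave duplicates values explicitly. A sketch (the tensor sizes and the scale factor of 2 here are just example assumptions):

```python
import torch
import torch.nn.functional as F

a = torch.randn(4, 3, 10)  # (batch, channels, length)

# Option 1: nearest-neighbor interpolation directly on the 3D tensor,
# using scale_factor instead of size, so no aspect-ratio check applies
up1 = F.interpolate(a, scale_factor=2, mode='nearest')

# Option 2: explicitly repeat each element along the length dimension
up2 = a.repeat_interleave(2, dim=2)

# Both produce shape (4, 3, 20) with each value duplicated, not interpolated
```

Both options copy adjacent values rather than blending them, which is the behaviour Keras's UpSampling1D gives.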

I think the core of the module and functional versions is the same, and nearest neighbour does appear to have an additional aspect-ratio constraint. It is likely a bit more wasteful, but you could instead scale both dimensions equally and throw away the extra dimension rather than squeezing it:

`F.upsample(torch.autograd.Variable(a), size=(20,2))[:,:,:,0]`
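Expanded into a runnable sketch (tensor sizes assumed; F.interpolate is the newer name for F.upsample, and its default mode is nearest, matching the one-liner above):

```python
import torch
import torch.nn.functional as F

a = torch.randn(4, 3, 10).unsqueeze(dim=3)  # (N, C, 10, 1)

# Scale both spatial dims by 2 so the nearest-neighbour aspect-ratio
# check passes, then slice away the extra width dimension.
a_up = F.interpolate(a, size=(20, 2), mode='nearest')[:, :, :, 0]
```

The slice `[:, :, :, 0]` keeps one of the two identical columns, so the result is the same as duplicating each value along the length dimension.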