Which function is better for upsampling: nn.Upsample or nn.functional.interpolate?

I am using the upsampling function for semantic segmentation. It worked in 0.4, but in 0.4.1 I got this warning:

/home/john/anaconda3/lib/python3.6/site-packages/torch/nn/modules/upsampling.py:122: UserWarning: nn.Upsampling is deprecated. Use nn.functional.interpolate instead.
warnings.warn("nn.Upsampling is deprecated. Use nn.functional.interpolate instead.")

I am wondering which function is better: interpolate or upsample? Why introduce a new function like interpolate when upsampling still works fine?


Seems to be just for easier reading…


It just mentioned

This function is deprecated in favor of torch.nn.functional.interpolate. This is equivalent with nn.functional.interpolate(...).

So are they the same? Thanks!

The warning message came from https://github.com/pytorch/pytorch/blob/ba5d33bedeec57bb35cf27a9d6023c8b3676dda5/torch/nn/modules/upsampling.py#L121.

I think you can safely ignore the warning if you are just using nn.Upsample as a layer.
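If the repeated warning is noisy, you can also silence just this UserWarning while keeping the layer unchanged. A minimal sketch (this only hides the message; it does not change behavior):

```python
import warnings

import torch
import torch.nn as nn

up = nn.Upsample(scale_factor=2, mode='nearest')

# Suppress the UserWarning emitted by nn.Upsample.forward in 0.4.1+
# only inside this context, not globally.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", UserWarning)
    y = up(torch.randn(1, 1, 4, 4))

print(y.shape)  # torch.Size([1, 1, 8, 8])
```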


So do you think nn.functional.interpolate and nn.Upsample provide the same result?

Yes, I do. nn.functional.interpolate contains the functionality of nn.functional.upsample_bilinear and nn.functional.upsample_nearest as well as nn.Upsample (or nn.functional.upsample) now.
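You can verify the equivalence directly. The sketch below runs the same input through both the layer form and the functional form and compares the outputs elementwise:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 3, 8, 8)

# Layer form (emits the deprecation warning in 0.4.1+)
up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)
y_layer = up(x)

# Functional form recommended by the warning
y_func = F.interpolate(x, scale_factor=2, mode='bilinear',
                       align_corners=False)

print(torch.equal(y_layer, y_func))  # True: both call the same implementation
```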

IMO, the warning is actually placed in the wrong spot. Since nn.Upsample is a layer and not a function, telling its users to switch to nn.functional.interpolate is odd.

You can check the difference in the implementation of nn.Upsample between 0.4.0 (top) and 0.4.1 (bottom):

    # 0.4.0
    def forward(self, input):
        return F.upsample(input, self.size, self.scale_factor, self.mode, self.align_corners)

    # 0.4.1
    def forward(self, input):
        warnings.warn("nn.Upsampling is deprecated. Use nn.functional.interpolate instead.")
        return F.interpolate(input, self.size, self.scale_factor, self.mode, self.align_corners)

This PR seems to have been rejected? It’s a bit hard to assess, since it’s all being done on phabricator I guess. But the deprecation warning is still present in master.
I’d hate to have to rewrite a perfectly suitable Sequential into a set of Sequentials with a custom forward just to get rid of this.

What’s the status on this? Will those modules be un-deprecated or renamed?


You could probably just wrap F.interpolate in an nn.Module and still use it in your sequential model. Have a look at this post for an example.
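A minimal sketch of such a wrapper (the class name `Interpolate` is my own choice; it is not part of torch.nn):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Interpolate(nn.Module):
    """Hypothetical module wrapper around F.interpolate,
    so it can sit inside an nn.Sequential."""

    def __init__(self, scale_factor, mode='nearest'):
        super().__init__()
        self.scale_factor = scale_factor
        self.mode = mode

    def forward(self, x):
        return F.interpolate(x, scale_factor=self.scale_factor,
                             mode=self.mode)


model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    Interpolate(scale_factor=2, mode='nearest'),
)
out = model(torch.randn(1, 3, 16, 16))
print(out.shape)  # torch.Size([1, 8, 32, 32])
```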


I guess I could. But I'd only do that if it is confirmed that Upsample is actually going to be deprecated.
By the way, I’d never think to search for “interpolate” in the docs if I want to change a tensor’s size.
I’d probably search for resize, rescale, upsample, downsample, scale, more or less in that order, then ask here. I get the meaning behind it of course, I just think it makes discovery harder than it needs to be. But time will tell that I guess.


Yeah, your feedback makes sense. It's sometimes hard to see that some methods are not named optimally when you've been working with the framework for a long time.
While I know that you can use F.interpolate for your use case, I clearly see that I’m kind of biased towards the current naming.

I just need a layer, not a function.

If nn.Upsample is deprecated, I cannot find any layer to replace it.

F.upsample and F.interpolate may be similar, but there is no nn.Interpolate, and I don’t know why.

Did you try the method posted here?