Strange behavior of functional.interpolate during training

Hi. This is the code that leads to the strange behavior:

        import torch.nn.functional as functional

        # downsample the float label map to 24x24 with bilinear interpolation
        labels1 = functional.interpolate(labels, size=24, mode='bilinear')

        print("### labels.long().min()", labels.long().min())
        print("### labels.min()", labels.min())

        print("### labels1.long().min()", labels1.long().min())
        print("### labels1.min()", labels1.min())

I got:

### labels.long().min() tensor(-1, device='cuda:2')
### labels.min() tensor(-1., device='cuda:2')
### labels1.long().min() tensor(0, device='cuda:2')
### labels1.min() tensor(0., device='cuda:2')
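
For reference, here is a minimal standalone sketch that shows the same kind of change in the minimum (the 1x1x96x96 shape and the placement of the single -1 "ignore" pixel are made up for illustration, not my actual data):

    import torch
    import torch.nn.functional as functional

    # Toy label map: zeros everywhere except one -1 "ignore" pixel in the corner.
    labels = torch.zeros(1, 1, 96, 96)
    labels[0, 0, 0, 0] = -1.0

    # Same interpolate call as in my training code.
    labels1 = functional.interpolate(labels, size=24, mode='bilinear')

    print(labels.min())   # tensor(-1.)
    print(labels1.min())  # tensor(0.) -- the corner pixel never contributes to the
                          # coarser 24x24 sampling grid, and an isolated -1 pixel in
                          # the interior would only show up as a fraction like -0.25.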

What’s wrong with functional.interpolate here? Why does labels1.min() become 0 when labels.min() is -1? Thanks.
