How to use a Variable in a for loop

Thanks. I have a question.
I’m writing a loss function.

def forward(self, input, target):
        y = one_hot(target, input.size(-1))
        Psoft = torch.nn.functional.softmax(input).cpu()   
        Loss=0.0
        t1=target.view(1,target.size(0)).cpu()
        for i in range(0,target.size(0)-1):
            t2=t1[0,i]
            for j in range(1,t2+1):
                P1=Psoft[i,:j]
                y1=y[i,:j]
                Loss += sum(P1-y1)**2
        Loss=Loss/target.size(0)
        return Loss  

and there is an error at the line for j in range(1,t2+1):
TypeError: 'Variable' object cannot be interpreted as an integer

If I instead write it as

def forward(self, input, target):
        y = one_hot(target, input.size(-1))
        Psoft = torch.nn.functional.softmax(input).cpu()   
        Loss=0.0
        t1=target.data.view(1,target.size(0)).cpu()
        for i in range(0,target.size(0)-1):
            t2=t1[0,i]
            for j in range(1,t2+1):
                P1=Psoft[i,:j]
                y1=y[i,:j]
                Loss += sum(P1-y1)**2
        Loss=Loss/target.size(0)
        return Loss  

then there is a different problem:
the type of Loss is float, so it doesn't carry a gradient.
What should I do?

If t2 contains a single integer value that you want to use as the loop boundary, you can use t2.item() to get a Python number out of the tensor. For a loop boundary you might need int(t2.item()).
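For context, here is a minimal, self-contained sketch of how the loop could look with that change (the one_hot helper below is just a stand-in for your own helper, the dim argument to softmax is an assumption about the intended axis, and the .cpu() calls are dropped for brevity):

import torch
import torch.nn.functional as F

def one_hot(target, num_classes):
    # stand-in for the poster's own one_hot helper:
    # (N,) LongTensor of class indices -> (N, num_classes) float tensor
    y = torch.zeros(target.size(0), num_classes)
    return y.scatter_(1, target.view(-1, 1), 1.0)

def forward(input, target):
    y = one_hot(target, input.size(-1))
    Psoft = F.softmax(input, dim=-1)
    loss = 0.0
    for i in range(target.size(0)):
        # .item() turns the 0-dim tensor into a plain Python number, so range() accepts it;
        # the loss itself is still computed on tensors, so gradients flow through Psoft
        t2 = int(target[i].item())
        for j in range(1, t2 + 1):
            P1 = Psoft[i, :j]
            y1 = y[i, :j]
            loss = loss + torch.sum(P1 - y1) ** 2
    return loss / target.size(0)

input = torch.randn(4, 5, requires_grad=True)
target = torch.tensor([1, 3, 0, 2])
loss = forward(input, target)
loss.backward()

The key point is to use .item() only for the Python-side loop boundary and keep the actual loss arithmetic on tensors; going through .data (as in your second version) detaches values from the graph, which is why the gradient is lost.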

Is item() a function added in version 0.4.0?

Yes, it was added in 0.4.0.
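For reference, on 0.4.0 and later it behaves like this (the values are just an illustration):

import torch

t = torch.tensor([5])       # a one-element LongTensor
n = t[0].item()             # -> 5, a plain Python int
print(type(n), n)           # <class 'int'> 5
for j in range(1, n + 1):   # now usable as a loop boundary
    print(j)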