'int' object has no attribute 'backward' in tutorial

I hope this is the correct place to ask a very beginner question.
I’m trying to generate some names following this tutorial:
https://pytorch.org/tutorials/intermediate/char_rnn_generation_tutorial.html

When I run code copy-pasted from it, I get this error:

Traceback (most recent call last):
  File "/home/sivarion/PycharmProjects/pythonProject/tut.py", line 167, in <module>
    output, loss = train(*randomTrainingExample())
  File "/home/sivarion/PycharmProjects/pythonProject/tut.py", line 150, in train
    loss.backward()
AttributeError: 'int' object has no attribute 'backward'

The error itself makes sense, since in the train() function loss is declared as:

loss = 0

It is indeed an integer, so it’s no surprise that it has no .backward() method.
But the tutorial explicitly states:

The magic of autograd allows you to simply sum these losses at each step and call backward at the end.

I’m not sure what I should do here or how to correctly define loss in this case.
Thank you in advance! :slight_smile:

Hi,

To use the .backward() method, your loss needs to be a PyTorch tensor. It’s possible that the for loop was skipped (for i in range(input_line_tensor.size(0)): in the tutorial you shared), so loss was never updated to a PyTorch tensor.

Can you check what the value of input_line_tensor.size(0) is?
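To make the failure mode concrete, here is a minimal sketch (not the tutorial code itself) of why loss stays an int: train() initializes loss = 0 and only replaces it with a tensor inside the loop body, so a zero-length input skips the loop entirely. The toy_train function below is a hypothetical stand-in for the tutorial's train().

```python
import torch

def toy_train(step_losses):
    loss = 0  # plain Python int, just like in the tutorial's train()
    for step_loss in step_losses:  # mirrors: for i in range(input_line_tensor.size(0))
        loss = loss + step_loss    # first iteration turns loss into a tensor
    return loss

# Normal case: loss becomes a tensor and supports .backward()
ok = toy_train([torch.tensor(0.5, requires_grad=True),
                torch.tensor(0.3, requires_grad=True)])
ok.backward()  # works

# Empty input: the loop body never runs, so loss is still an int,
# and calling .backward() on it raises AttributeError
bad = toy_train([])
print(type(bad))  # <class 'int'>
```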

OK, so as usual, the problem was between the keyboard and the back of the chair :wink:
I added a file with my own names to the supplied datasets; it looked identical to the others and loaded correctly, so I assumed it wouldn’t cause problems.

After your comment, I decided to check whether and how many times the loop runs. It turns out it runs correctly until it hits a name from my file. After removing that name, everything works flawlessly :slight_smile:

Now I still need to figure out how that data differs and fix it accordingly, but I suppose that’s no longer a PyTorch problem. Thanks for your support!
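For anyone hitting the same issue: one plausible cause (an assumption, not confirmed from the original file) is a name that the tutorial's unicodeToAscii() reduces to an empty string, which yields a zero-length input_line_tensor and skips the training loop. The snippet below reproduces the tutorial's normalization and flags such lines; the sample names list is made up for illustration.

```python
import string
import unicodedata

all_letters = string.ascii_letters + " .,;'-"  # as defined in the tutorial

def unicodeToAscii(s):
    # Same normalization as the tutorial: strip accents, keep only all_letters
    return ''.join(
        c for c in unicodedata.normalize('NFD', s)
        if unicodedata.category(c) != 'Mn' and c in all_letters
    )

def find_empty_names(lines):
    # Return (line_number, original_text) for lines that normalize to ''
    return [(i, line) for i, line in enumerate(lines, 1)
            if unicodeToAscii(line.strip()) == '']

# Hypothetical sample data: a blank line and a fully non-Latin name
# both normalize to '' and would break train()
names = ["Ariel", "", "Дмитрий"]
print(find_empty_names(names))  # [(2, ''), (3, 'Дмитрий')]
```

Non-Latin scripts such as Cyrillic are dropped entirely by this normalization, so a name written only in such a script becomes an empty string even though the file looks fine at a glance.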


Awesome, happy to help!
