Getting "buffers have already been freed" even though retain_graph is True

I am making my own linear regression model from scratch.

Here is the code:

import torch

w = torch.tensor(-10.0, requires_grad=True)

X = torch.arange(-3, 3.1, 0.1, requires_grad=True).view(-1, 1)
F = -3*X

Y = f + 0.1 * torch.randn(X.size())

def forward(x):
    global w
    y = w*x  
    return y

def mse(yhat, y):
    return torch.mean((y-yhat)**2)

learning_rate = 0.1

for epoch in range(1, 11):
    yhat = forward(x)
    loss = mse(yhat, y)
    loss.backward(retain_graph=True) # error occurs here

The following exception is thrown after the first successful iteration over the dataset:

RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
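For reference, here is a minimal sketch of what the training loop presumably intends, with consistent lowercase names and a manual gradient-descent step added (the original snippet stops at backward() and never updates w). Since the data tensor no longer has requires_grad=True, a fresh graph is built on every iteration and retain_graph is unnecessary:

```python
import torch

torch.manual_seed(0)

w = torch.tensor(-10.0, requires_grad=True)

x = torch.arange(-3, 3.1, 0.1).view(-1, 1)   # no requires_grad on the data
f = -3 * x
y = f + 0.1 * torch.randn(x.size())          # noisy targets, not part of any graph

def forward(x):
    return w * x

def mse(yhat, y):
    return torch.mean((y - yhat) ** 2)

learning_rate = 0.1
losses = []

for epoch in range(1, 11):
    yhat = forward(x)
    loss = mse(yhat, y)
    loss.backward()                  # a new graph is built each iteration
    with torch.no_grad():
        w -= learning_rate * w.grad  # manual gradient-descent step
    w.grad.zero_()                   # reset the accumulated gradient
    losses.append(loss.item())
```

With this setup w converges to roughly -3, the slope used to generate the data.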


Which version of PyTorch are you using? I can run your code without any issue with the latest version (after changing X -> x, F -> f, and Y -> y).

I am using PyTorch 1.4.0 on Google Colab.

It runs fine for me in Colab as well.

Am I missing something?

Can you try installing a nightly version of PyTorch?
Also, which Python version are you using?

I am using Python 3 on Colab.

It's working when I remove requires_grad=True from

X = torch.arange(-3, 3.1, 0.1, requires_grad=True).view(-1, 1)
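A plausible reason this helps: with requires_grad=True on X, the target Y is computed through X and therefore carries a piece of the autograd graph that is built only once, outside the loop, so repeated backward() calls walk that shared subgraph again. If you genuinely need gradients through X elsewhere, an alternative sketch is to cut the target out of the graph with .detach() (variable names follow the thread; this is a suggestion, not the original poster's code):

```python
import torch

# If x must keep requires_grad=True, detach the target so the loss graph
# only contains the freshly built w * x part on every iteration.
x = torch.arange(-3, 3.1, 0.1, requires_grad=True).view(-1, 1)
y = (-3 * x + 0.1 * torch.randn(x.size())).detach()  # target: no graph attached

w = torch.tensor(-10.0, requires_grad=True)

for epoch in range(2):                     # the second backward() was the failing one
    loss = torch.mean((y - w * x) ** 2)
    loss.backward()                        # no retain_graph needed now
    w.grad.zero_()
```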