Retain_graph issue

I am experimenting with autograd in PyTorch and trying to understand the retain_graph option. I prepared the following code to make the question easier to discuss:

import torch

a = torch.tensor(2.0, requires_grad=True)
b = 2 * a ** 2
c = 5 * b
d = torch.sin(c)
e = torch.cos(c)
d.backward()      # frees the intermediate buffers of the graph
a.grad.zero_()
e.backward()      # raises a RuntimeError: the graph has already been freed

This code raises an error, so I need to set retain_graph=True in d.backward(), which is simple enough.
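
For reference, this is a minimal sketch of the fix I mean, assuming the same setup as above: passing retain_graph=True to the first backward() keeps the graph alive so the second backward() can reuse it.

import torch

a = torch.tensor(2.0, requires_grad=True)
b = 2 * a ** 2
c = 5 * b
d = torch.sin(c)
e = torch.cos(c)
d.backward(retain_graph=True)  # keep the graph so it can be traversed again
a.grad.zero_()
e.backward()                   # works now; no retain_graph needed on the last call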
But when I change b = 2 * a ** 2 to b = 3 * a, the following code does not raise the error:

import torch

a = torch.tensor(2.0, requires_grad=True)
b = 3 * a
c = 5 * b
d = torch.sin(c)
e = torch.cos(c)
d.backward()
a.grad.zero_()
e.backward()      # no error this time, even without retain_graph=True

My question is: why doesn't the second example raise the error? Why does the power operation require setting retain_graph=True, while multiplication by a constant does not?