Hi,

I am running into the error “Trying to backward through the graph a second time.”

I think the following two code snippets should be equivalent, but they are not. What is the difference?

code 1

```
import torch

x = torch.tensor(0.1)
w = torch.tensor(0.1, requires_grad=True)
y = x * w
z = y * w
y.backward()
z.backward()  # raises "Trying to backward through the graph a second time"
# c = y + z
# c.backward()
```

code 2

```
import torch

x = torch.tensor(0.1)
w = torch.tensor(0.1, requires_grad=True)
y = x * w
z = y * w
# y.backward()
# z.backward()
c = y + z
c.backward()  # works: a single backward pass through the whole graph
```

Why does the first version raise “Trying to backward through the graph a second time” while the second one works fine?
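For what it's worth, I noticed that if I pass `retain_graph=True` to the first `backward()` call, code 1 runs without the error and `w.grad` ends up the same as in code 2, but I still don't understand why that flag is needed here:

```
import torch

x = torch.tensor(0.1)
w = torch.tensor(0.1, requires_grad=True)
y = x * w
z = y * w
y.backward(retain_graph=True)  # keep the graph alive for the next backward call
z.backward()                   # now succeeds; gradients accumulate into w.grad
print(w.grad)                  # tensor(0.1200), same value code 2 produces
```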

Thanks.