Why does volatility change after the operation: parameter = parameter - (0.01 * parameter.grad)?

Hi All,
I wrote the following code to update the parameter manually using its gradient, but I get this error:

`RuntimeError: element 0 of variables tuple is volatile`

and parameter.volatile is now True. Why does this operation change the volatility of parameter?

import torch
from torch.autograd import Variable
torch.manual_seed(1024)

input = Variable(torch.rand(100, 12))
parameter = Variable(torch.rand(100, 12), requires_grad=True, volatile=False)

for i in range(10):
    loss = ((input*parameter).abs()-1).abs().sum()
    print(loss)
    loss.backward()
    parameter = parameter - (0.01 * parameter.grad)

Any suggestions would be helpful.
Thank you!

volatile is a viral flag: if any input to an operation is volatile, its output (and everything downstream in the graph) becomes volatile. In your snippet parameter.grad is a volatile Variable, so the reassigned parameter inherits that flag.
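A quick sketch with the legacy Variable API (PyTorch < 0.4, where volatile still exists) to illustrate the propagation; the variable names here are just for illustration:

import torch
from torch.autograd import Variable

a = Variable(torch.rand(3), requires_grad=True)  # ordinary, non-volatile leaf
b = Variable(torch.rand(3), volatile=True)       # volatile input

c = a + b
print(c.volatile)  # True: the result inherits volatility from b
# c.sum().backward() would raise the same "variables tuple is volatile" error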

So I think it needs to be revised as follows:

parameter = parameter - (0.01 * Variable(parameter.grad.data, volatile=False))

It works, thank you!

But here comes another problem.
When I run the following code, the first iteration works fine, but in the second iteration I get an error because parameter.grad is None, which means the backward pass failed.

"""The tirst iter"""

loss = ((input*parameter).abs()-1).abs().sum()
loss.backward()
parameter = parameter - (0.01 * Variable(parameter.grad.data, requires_grad=True, volatile=False))

"""The second iter """
loss = ((input*parameter).abs()-1).abs().sum()
loss.backward()
parameter = parameter - (0.01 * Variable(parameter.grad.data, requires_grad=True, volatile=False))

Any idea about this?
Thank you~

Update the underlying tensor in place instead, so that parameter stays the same leaf Variable and backward() keeps writing into parameter.grad:

parameter.data.copy_(parameter.data - (0.01 * parameter.grad.data))


[quote=“smth, post:5, topic:11639, full:true”]
parameter.data.copy_(parameter.data - (0.01 * parameter.grad.data))
[/quote]
It works, thank you~
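For completeness, here is a minimal sketch of the whole update loop with the in-place .data update, still using the legacy Variable API (pre-0.4). The gradient zeroing at the end is my own addition (backward() accumulates into .grad), not something shown in this thread:

import torch
from torch.autograd import Variable

torch.manual_seed(1024)
input = Variable(torch.rand(100, 12))
parameter = Variable(torch.rand(100, 12), requires_grad=True)

for i in range(10):
    loss = ((input * parameter).abs() - 1).abs().sum()
    loss.backward()
    # update the underlying tensor in place, so `parameter` stays the same
    # leaf Variable and backward() keeps filling parameter.grad
    parameter.data.copy_(parameter.data - (0.01 * parameter.grad.data))
    # clear the accumulated gradient before the next iteration
    # (not in the original thread, but backward() accumulates gradients)
    parameter.grad.data.zero_()
    print(loss.data[0])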