Autograd with custom inputs

I am trying to compute gradients with respect to a custom list of inputs.
Here is the code snippet:

    import torch
    import torch.nn.functional as F
    from torch.autograd import Variable

    # cv_params is a list of parameters whose .grad has already been
    # populated by an earlier backward pass
    etas = []
    for param in cv_params:
        eta = 2 * F.sigmoid(Variable(torch.zeros(1)))
        param.grad = param.grad * eta  # rescale each gradient by its eta
        etas.append(eta)

    # flatten the rescaled gradients into a single vector
    flat_params = []
    for param in cv_params:
        flat_params.append(param.grad.view(-1))
    flat_params = torch.cat(flat_params, 0)

    var_loss = (flat_params ** 2).mean()

    # differentiate var_loss with respect to the etas
    var_grads = torch.autograd.grad(outputs=var_loss, inputs=etas, create_graph=True)
    torch.autograd.backward(etas, var_grads)

but it gives me this error:

    RuntimeError: One of the differentiated Variables does not require grad

This happens even though the inputs (the etas) appear in the computation graph of var_loss.

Solved! A Variable defaults to requires_grad=False, so the etas were never tracked by autograd as differentiable inputs. I just needed to do

    eta = 2 * F.sigmoid(Variable(torch.zeros(1), requires_grad=True))
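
For reference, here is a minimal self-contained sketch of the same pattern with that fix applied. The two toy parameters and the dummy loss used to populate their .grad are placeholders, not part of my actual code:

    import torch
    import torch.nn.functional as F
    from torch.autograd import Variable

    # placeholder parameters and a dummy loss, only to fill their .grad
    w1 = torch.randn(3, requires_grad=True)
    w2 = torch.randn(2, requires_grad=True)
    cv_params = [w1, w2]
    sum((p ** 2).sum() for p in cv_params).backward()

    etas = []
    for param in cv_params:
        # requires_grad=True is the fix: each eta is now a tracked input
        eta = 2 * F.sigmoid(Variable(torch.zeros(1), requires_grad=True))
        param.grad = param.grad * eta
        etas.append(eta)

    flat_params = torch.cat([param.grad.view(-1) for param in cv_params], 0)
    var_loss = (flat_params ** 2).mean()

    # the etas require grad, so this no longer raises the RuntimeError
    var_grads = torch.autograd.grad(outputs=var_loss, inputs=etas, create_graph=True)
    torch.autograd.backward(etas, var_grads)

On recent PyTorch versions the Variable wrapper is no longer needed; torch.zeros(1, requires_grad=True) passed to torch.sigmoid does the same thing.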