RuntimeError: function SetItemBackward returned a gradient different than None at position 3, but the corresponding forward input was not a Variable

Above is the error I am getting. Unfortunately, I am not able to reproduce it in a small example. A similar issue for ConcatBackward is addressed in https://github.com/pytorch/pytorch/issues/2367, but I am unable to figure out the equivalent fix for SetItemBackward.

Any help would be appreciated. Thanks.

I figured out the error. I was storing Variables in a dictionary and using some elements of that dictionary as inputs to another neural network. I worked around it by making this new input not require gradients, using the .detach() method.
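For anyone hitting the same error, here is a minimal sketch of that workaround. The module and dictionary names are made up for illustration, and it uses plain tensors rather than the older Variable API; the key point is calling .detach() on the dictionary entry before reusing it as an input to the second network.

```python
import torch
import torch.nn as nn

# Hypothetical networks and dictionary, just to illustrate the pattern.
feature_net = nn.Linear(8, 4)
second_net = nn.Linear(4, 2)

x = torch.randn(3, 8, requires_grad=True)
outputs = {"features": feature_net(x)}  # dict entry is still attached to the graph

# Detach the stored tensor before feeding it to the second network,
# so the second forward/backward pass does not try to go through it.
detached_input = outputs["features"].detach()  # requires_grad is now False
y = second_net(detached_input)
y.sum().backward()  # gradients only reach second_net's parameters
```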