How can I prevent a variable from being tracked by automatic differentiation?

Hi,
In my implementation, I have the following piece of code:

dec = decoder_output.clone().squeeze()
# mask previously visited states so they cannot be selected again
for i in self.state_history:
    dec[i] = -100000

topv, topi = dec.topk(1)
action = topi.squeeze().detach()

decoder_output is the output of F.log_softmax.
My question is: is dec recorded for automatic differentiation, and does it affect the gradients of decoder_output and the rest of the graph? What is the best way to exclude it from automatic differentiation?
Also, is accessing dec[i] inside a for loop possible if dec is a CUDA tensor?

The assignment will be recorded, and the in-place operation could raise an error during the backward pass.
You could detach dec from decoder_output if you don’t want this assignment to be tracked.
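For example, a minimal sketch of that change, keeping your variable names:

# detach first so the clone is not part of the autograd graph;
# the masking and topk below are then not recorded
dec = decoder_output.detach().clone().squeeze()
for i in self.state_history:
    dec[i] = -100000

topv, topi = dec.topk(1)
action = topi.squeeze()  # dec is already detached, so no further .detach() is needed

decoder_output itself and the rest of the graph are unaffected, since only the detached copy is modified.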

Yes, that’s possible.
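As a standalone illustration (the tensor shape and the state_history values here are made up), element-wise assignment in a Python loop works on a CUDA tensor, although a vectorized index assignment avoids the loop entirely:

import torch

dec = torch.randn(10, device='cuda')
state_history = [2, 5, 7]  # hypothetical indices

# indexing a CUDA tensor inside a Python loop works,
# but each assignment launches its own kernel
for i in state_history:
    dec[i] = -100000

# equivalent vectorized assignment without the loop
idx = torch.tensor(state_history, device=dec.device)
dec[idx] = -100000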