Two variables, one with requires_grad=True, the other with requires_grad=False

If I use torch.stack to stack two variables, one with requires_grad=True and the other with requires_grad=False, will these properties be retained during .backward() and .step()?

After experimenting a little, it seems the answer is yes.

Yes, in general, if any input requires grad, so will the output. Gradients still flow back only to the inputs that require grad.
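A minimal sketch of this behavior (the tensor names here are made up for illustration):

```python
import torch

a = torch.ones(3, requires_grad=True)
b = torch.zeros(3)  # requires_grad=False by default

stacked = torch.stack([a, b])
print(stacked.requires_grad)  # True: any input requiring grad makes the output require grad

# Gradients flow back only to the input that requires grad.
stacked.sum().backward()
print(a.grad)  # tensor([1., 1., 1.])
print(b.grad)  # None: b never required grad
```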


Hi, did you also use torch's default_collate function in the DataLoader on stacked tensors with different requires_grad values? I am hitting a problem when creating batches from this kind of data.