Why is requires_grad==False after multiplication?

How is it possible that the following three lines of code:

print(X.requires_grad, W.requires_grad)
K = X * W
print(K.requires_grad)

yield this output:

False True
False

? In my understanding, K.requires_grad should be True, regardless of anything else in the code.

Thanks in advance

What version of python and pytorch are you on?

Python 3.8.10

I tried the same on my system and it did come out True:

Python 3.8.0 (default, Nov  6 2019, 21:49:08) 
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.__version__
>>> x = torch.rand(10, requires_grad=False)
>>> w = torch.rand(10, requires_grad=True)
>>> print(x.requires_grad, w.requires_grad)
False True
>>> k = x * w
>>> k.requires_grad
True
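For reference, one way k.requires_grad can come out False even though w.requires_grad is True is if the multiplication happens while gradient recording is disabled, e.g. under torch.no_grad(). A minimal sketch:

```python
import torch

x = torch.rand(10, requires_grad=False)
w = torch.rand(10, requires_grad=True)

# Normally, the result of an op requires grad if any of its inputs does
k = x * w
print(k.requires_grad)  # True

# Under no_grad, the operation is not recorded in the graph,
# so the result does not require grad
with torch.no_grad():
    k = x * w
print(k.requires_grad)  # False
```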

Could it be possible that something is changing the tensor k?

That is what I'm wondering as well, but I have no idea how, since I print K.requires_grad right after initializing it as X * W.

Can you send a minimal, reproducible example of your code?

It wouldn't be easy, since this part is reached through several nested function calls. But the reason I am confused is that I don't think this should be possible, regardless of other parts of the code.

Sorry, without that, it would be really hard for me to debug.

I found the problem. It would be complicated to explain, but I'm passing W to a wrapper function that implements my forward and backward pass, and it messes with the computational graph. I'm still slightly confused by how the output of the 3 lines I sent is possible, but it seems clear how I can get rid of it. Thanks for the help!
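For anyone who runs into the same thing later: if the wrapper is a custom torch.autograd.Function (the names below are hypothetical, just a sketch of one plausible setup), this output is expected, because forward() runs with gradient recording disabled. Inside forward, W still reports requires_grad=True, but the product created there does not require grad; the output only gets reattached to the graph once apply() returns.

```python
import torch

# Hypothetical wrapper implementing a custom forward/backward pass
class MulWrapper(torch.autograd.Function):
    @staticmethod
    def forward(ctx, X, W):
        # Grad recording is disabled inside forward(), so even though
        # W.requires_grad is True here, the product does not require grad
        print(X.requires_grad, W.requires_grad)  # False True
        K = X * W
        print(K.requires_grad)                   # False
        ctx.save_for_backward(X)
        return K

    @staticmethod
    def backward(ctx, grad_out):
        (X,) = ctx.saved_tensors
        # d(X*W)/dW = X; X itself needs no gradient
        return None, grad_out * X

X = torch.rand(10, requires_grad=False)
W = torch.rand(10, requires_grad=True)
K = MulWrapper.apply(X, W)
print(K.requires_grad)  # True: apply() reattaches the output to the graph
```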