Parameter Tensor grad func conflict

I was wondering what happens if you have a parameter (P) whose requires_grad is set to True, but whose underlying tensor, P.data, has requires_grad set to False. So P is a Parameter with requires_grad=True, while P.data.requires_grad is False. What happens if I back-propagate through a neural network that has this configuration?

Also, I noticed that when I printed out the .data value of one of the parameters in my nn.Module, its requires_grad property is set to False. Does this mean that the requires_grad of .data is not used if it belongs to a Parameter whose requires_grad property is True?
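
To illustrate, here is a minimal sketch of what I'm describing (the tensor values are just placeholders):

```python
import torch
import torch.nn as nn

p = nn.Parameter(torch.ones(3))
print(p.requires_grad)       # True
print(p.data.requires_grad)  # False, even though p itself requires grad

# Backpropagation still appears to work: the gradient lands on p
loss = (p * 2).sum()
loss.backward()
print(p.grad)                # tensor([2., 2., 2.])
```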

The .data attribute is used internally, and only the parent Parameter reflects the .requires_grad attribute.
Note that the usage of .data is deprecated and can lead to various side effects, as Autograd won't be able to track operations performed on .data.
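
A small sketch of the difference (assuming a fresh nn.Parameter; torch.no_grad() is the usual replacement for touching .data directly):

```python
import torch
import torch.nn as nn

p = nn.Parameter(torch.ones(3))

# An operation on .data is invisible to Autograd: no graph is built
y = p.data * 2
print(y.requires_grad)  # False

# The recommended way to modify a parameter without tracking
# is a no_grad block instead of .data:
with torch.no_grad():
    p.add_(1.0)         # in-place update, not recorded by Autograd
print(p)                # tensor([2., 2., 2.], requires_grad=True)
```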