How to make model parameters a function of a tensor?

I have a simple neural net.

import torch
import torch.nn as nn

class my_nn(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 5)
        # 4*5 weights + 5 biases = 25 entries
        self.w = torch.randn(25, requires_grad=True)

    def forward(self, x):
        return torch.tanh(self.fc(x))

I want the parameters of the my_nn object (the weight and bias of self.fc) to be a function of self.w (say, the flattened and concatenated parameters equal self.w ** 2). When backpropagating, I would like to get the gradient of the loss with respect to self.w, not with respect to the my_nn parameters. How can one do this?
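To make the intended setup concrete, here is one sketch of what I mean: instead of letting nn.Linear own its parameters, the layer's weight and bias would be computed from self.w inside forward (here via torch.nn.functional.linear), so that only self.w receives a gradient. The class name MyNN and the use of self.w ** 2 as the parameterization are just illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyNN(nn.Module):
    def __init__(self):
        super().__init__()
        # 25 entries: 5*4 weights + 5 biases
        self.w = nn.Parameter(torch.randn(25))

    def forward(self, x):
        params = self.w ** 2               # derived parameters, a function of self.w
        weight = params[:20].view(5, 4)    # (out_features, in_features)
        bias = params[20:]
        return torch.tanh(F.linear(x, weight, bias))

net = MyNN()
out = net(torch.randn(3, 4))
out.sum().backward()
print(net.w.grad.shape)  # gradient flows to self.w only
```

An alternative (if rebuilding the layer functionally is undesirable) might be torch.nn.utils.parametrize.register_parametrization, which reparameterizes an existing module's parameter as a function of another tensor.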