Loss stays constant during training


I am trying to build a simple model, but unfortunately the loss stays constant.
Here is the part of my code that I think contains the bug:

def transform(x):
    # reshape to (batch, 81, 9) and take the index of the max entry in each group of 9
    x = x.view(x.shape[0], 81, 9)
    x = torch.argmax(x, dim=2).double()

    return x
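To show what `transform` does, here is a minimal run on random data (the batch size of 2 is just an example); I noticed that the result has no `grad_fn`:

```python
import torch

def transform(x):
    # reshape to (batch, 81, 9) and take the class index per group of 9
    x = x.view(x.shape[0], 81, 9)
    x = torch.argmax(x, dim=2).double()
    return x

x = torch.randn(2, 81 * 9, requires_grad=True)
y = transform(x)
print(y.shape)     # torch.Size([2, 81])
print(y.grad_fn)   # None
```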

def train():

    # some other code

    optim = torch.optim.SGD(self.parameters(), lr=0.1)
    loss_fun = nn.MSELoss()
    for e in range(epochs):
        # pass the data through my only layer
        output = self(train_data)

        # convert the labels to double (I get an error if I delete this)
        train_labels = train_labels.double()
        # apply some changes to my output
        loss = loss_fun(transform(output), train_labels)
        # append the loss to an array so I can plot it
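A self-contained sketch of what I'm seeing (the `nn.Linear` layer, seed, and batch size are stand-ins for my real setup): the layer output requires grad, but the loss computed through `transform` does not.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# hypothetical stand-in for my only layer
model = nn.Linear(81 * 9, 81 * 9)
loss_fun = nn.MSELoss()

train_data = torch.randn(4, 81 * 9)
train_labels = torch.randint(0, 9, (4, 81)).double()

def transform(x):
    x = x.view(x.shape[0], 81, 9)
    x = torch.argmax(x, dim=2).double()
    return x

output = model(train_data)
loss = loss_fun(transform(output), train_labels)

print(output.requires_grad)   # True
print(loss.requires_grad)     # False
```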
            #append the loss to this array so I can plot it

list(self.parameters()) returns tensors filled with random data, which I think is fine.
However, if I iterate over my parameters and call .grad, I always get None, so I think I am somehow damaging other gradients (maybe deleting ones that were already calculated?). Thank you!
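This is roughly the check I'm running (with a hypothetical `nn.Linear` standing in for my model):

```python
import torch.nn as nn

# stand-in for my model; my real model also has a single layer
model = nn.Linear(10, 10)

# .grad is None for every parameter, even after training
grads = [p.grad for p in model.parameters()]
print(grads)   # [None, None]
```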