My loss is showing random behavior: it bounces up and down from one epoch to the next instead of decreasing steadily.

for step, (x, y) in enumerate(loader):

    count += 1  # count must be initialized to 0 before the loop

    if net.training:  # note: `if net.train(mode=True):` is always true, because .train() returns the module itself
        logits = net(x.float())
        target = y.long()
        loss = criterion(logits, target)
        optimizer.zero_grad()  # clear stale gradients before backward
        loss.backward()
        optimizer.step()
    else:
        with torch.no_grad():  # no gradients needed during evaluation
            logits = net(x.float())
            target = y.long()
            loss = criterion(logits, target)
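Part of the apparent randomness may simply be that a single batch's loss is noisy; averaging the loss over the whole epoch gives a much steadier curve. A self-contained sketch on synthetic data (the shapes, model, learning rate, and optimizer here are placeholders, not your actual setup):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Synthetic stand-in for the real dataset: 64 samples, 4 features, 3 classes.
X = torch.randn(64, 4)
y = torch.randint(0, 3, (64,))
loader = DataLoader(TensorDataset(X, y), batch_size=16)

net = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)

net.train()
running = 0.0
for x, yb in loader:
    optimizer.zero_grad()
    loss = criterion(net(x.float()), yb.long())
    loss.backward()
    optimizer.step()
    # Weight each batch's loss by its size, then divide by the dataset size.
    running += loss.item() * x.size(0)

epoch_loss = running / len(loader.dataset)
print(epoch_loss)
```

Reporting `epoch_loss` once per epoch, instead of the last batch's loss, removes one source of jitter from the printed numbers.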

I have not applied softmax to the outputs, since nn.CrossEntropyLoss expects raw logits and applies log-softmax internally.
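As a sanity check that skipping softmax is right: cross-entropy on raw logits computes -log(softmax(logits)[target]) itself, so applying softmax first would distort the loss. A pure-Python sketch of what it computes for one sample (numerically stabilized by subtracting the max logit):

```python
import math

def cross_entropy(logits, target):
    # -log(softmax(logits)[target]) == logsumexp(logits) - logits[target]
    m = max(logits)
    log_sum_exp = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum_exp - logits[target]

# Class 0 has the largest logit, so the loss for target 0 is small.
print(cross_entropy([2.0, 1.0, 0.1], 0))
```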
Forward propagation:

    def forward(self, x):
        x = self.act(self.layer1(x))
        x = self.layer2(x)
        return x  # raw logits, no softmax
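For context, this is how that forward pass might sit inside a complete module (a runnable sketch; the layer sizes, ReLU activation, and number of classes are my assumptions, not the original values):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, in_features=4, hidden=16, n_classes=3):  # sizes are placeholders
        super().__init__()
        self.layer1 = nn.Linear(in_features, hidden)
        self.layer2 = nn.Linear(hidden, n_classes)
        self.act = nn.ReLU()

    def forward(self, x):
        x = self.act(self.layer1(x))
        x = self.layer2(x)  # raw logits; CrossEntropyLoss applies log-softmax internally
        return x

net = Net()
logits = net(torch.randn(8, 4))  # batch of 8 samples
print(logits.shape)
```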
Here is my output, one train/test pair per epoch (the grad_fn class names were stripped when pasting):

Epoch   Train loss   Test loss
1       1.8975       1.9753
2       1.7179       1.5440
3       1.6857       1.4142
4       1.3730       2.0773
5       1.5744       1.9398
6       1.3640       1.0328
7       1.3167       1.5097
8       1.6103       1.2433
9       1.7091       1.3714
10      1.2498       1.6427
11      1.8755       1.3751
12      1.2940       1.6323
13      1.4709       1.2648
14      1.3212       1.7257
15      1.4292       1.9744
16      1.4333       1.4438
17      1.3763       1.5208
18      1.4129       1.6961
19      1.2673       1.3677
20      1.4119       1.5740
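A quick way to check whether there is a trend underneath the noise is to compare the average of the first few and last few training losses (values copied from the output above):

```python
train = [1.8975, 1.7179, 1.6857, 1.3730, 1.5744, 1.3640, 1.3167,
         1.6103, 1.7091, 1.2498, 1.8755, 1.2940, 1.4709, 1.3212,
         1.4292, 1.4333, 1.3763, 1.4129, 1.2673, 1.4119]

first5 = sum(train[:5]) / 5
last5 = sum(train[-5:]) / 5
# The first-five average is higher than the last-five average,
# so the loss is drifting down slowly despite the batch-to-batch noise.
print(first5, last5)
```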