Hi all! I’m new to this forum, but I’ve been having trouble for quite a few days trying to get a vanilla UNet model to overfit on a single image in training and haven’t ever run into this kind of issue before, so I thought I’d shout into the depths of the internet in case others have some insight into this.
The UNet is being trained for 2D image segmentation with ground truth masks (1 class). I am currently training on a single image to debug the network.
I have confirmed that:
- my gradients and weights don't explode (there is gradient clipping);
- the image normalization is correct (I scale pixels from 0-255 to 0-1 and then normalize with the dataset mean and stddev);
- the mask values are correct and only in {0, 1};
- the output and ground-truth masks have the same shape;
- the loss is BCE with logits.
I've also tried learning rates from 1e-9 to 0.1 in case I had missed some level of granularity, as well as kaiming, xavier, and normal layer initializations (plus the default PyTorch initialization).
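For concreteness, the checks I'm describing amount to something like the sketch below (simplified; `model`, `image`, `mask`, `mean`, and `std` are placeholders for my actual variables, not my real code):

```python
# Hypothetical sketch of the sanity checks described above.
import torch
import torch.nn as nn

def sanity_check(model, image, mask, mean, std):
    # Scale pixels from 0-255 to 0-1, then normalize with dataset stats.
    x = image.float() / 255.0
    x = (x - mean) / std

    # Mask values should be exactly in {0, 1}.
    assert set(mask.unique().tolist()) <= {0.0, 1.0}

    logits = model(x)
    # Output and ground-truth shapes must match for BCEWithLogitsLoss.
    assert logits.shape == mask.shape

    loss = nn.BCEWithLogitsLoss()(logits, mask.float())
    loss.backward()
    # Clip gradients to guard against explosion.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    return loss.item()
```

All of these assertions pass on my setup, which is what makes the plateau so confusing.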
The model is the one from this repository: https://github.com/milesial/Pytorch-UNet/tree/master/unet
The training procedure is also the one found here: https://github.com/milesial/Pytorch-UNet/blob/master/train.py
The model trains very erratically, with high variance from run to run. Within a single run, the loss decreases for a couple of iterations, then jumps all over, and after a few epochs it increases steadily until it plateaus around 0.7, even though the gradients don't explode. The output predictions are also dismal and unmeaningful.
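For context, the single-image overfit test I'm running follows the standard pattern below (a sketch with a tiny toy conv net standing in for the UNet, and a synthetic image/mask pair standing in for my data):

```python
# Minimal single-image overfit loop (sketch; a tiny conv net stands in
# for the UNet, but the training pattern is the same).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(8, 1, 1))
x = torch.rand(1, 3, 32, 32)                       # one normalized image
y = (x.mean(dim=1, keepdim=True) > 0.5).float()    # binary mask in {0, 1}

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

losses = []
for step in range(300):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
    opt.step()
    losses.append(loss.item())

# With a working pipeline, the loss on a single memorizable image should
# decrease steadily toward 0 rather than plateauing near 0.7 (note that
# ln(2) ≈ 0.693 is the BCE loss of a model predicting 0.5 everywhere).
```

My runs look nothing like this: the loss climbs back up and sticks around that ~0.693 chance-level value, which is what makes me think the model is collapsing to a constant prediction.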
I’ve also tried training a completely different model, pretrained on a separate but related dataset, using the same training and dataset code. Unfortunately, the behavior is exactly the same, which makes me suspect an error in the training or dataset code, but I can’t find one and I’ve debugged pretty much everything I could think of.
Would anyone be able to provide some guidance on this? Thanks so much!