I’m only in the first epoch and the loss is already very erratic, and it stays that way no matter how many epochs I run. There is a general downward trend, but changing the learning rate seems to have no effect. I’ve tried learning rates from 0.001 down to 0.0003.
My GNN has approximately 5M parameters and is training on around 8M samples of graph data (node features and edge indices). The input values are normalized, so it can’t be that. I’m using the Adam optimizer.
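For what it's worth, one way I could check whether the curve is genuinely erratic or just noisy per-batch loss is to smooth it with an exponential moving average before plotting. A minimal sketch (the loss values here are made up for illustration):

```python
def ema(values, alpha=0.1):
    """Exponential moving average: separates batch-to-batch noise
    from the underlying trend of the loss curve."""
    smoothed = []
    avg = values[0]  # seed with the first value
    for v in values:
        avg = alpha * v + (1 - alpha) * avg
        smoothed.append(avg)
    return smoothed

# Hypothetical noisy per-batch losses:
raw = [1.0, 0.4, 1.2, 0.3, 0.9, 0.2, 0.8, 0.1]
print(ema(raw))  # much flatter curve; a real downward trend survives smoothing
```

If the smoothed curve still jumps around, that would point more strongly at the optimizer settings or the data rather than ordinary minibatch noise.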