Why doesn't my loss decrease during training?

While training my model, I found that the loss doesn't decrease. The model input is a 3-D image, the loss function is binary cross-entropy, and the optimizer is SGD with momentum 0.9, an initial learning rate of 0.01, and a batch size of 256.

The loss over the 30 epochs is:
0.749619674
0.693235771, 0.69320591, 0.693198329, 0.693196385, 0.693205666, 0.693200624
0.693216185, 0.69323178, 0.693249031, 0.69317602, 0.693203645, 0.69319184
0.693239689, 0.693188139, 0.693181048, 0.693167897, 0.693213663, 0.693183094
0.693211535, 0.693212263, 0.69321184, 0.693209582, 0.693231421, 0.693211854
0.693185517, 0.693207201, 0.693201362, 0.693189077, 0.693243751

Can anyone give me a suggestion? Thanks!

A loss that plateaus at ~0.693 is suspicious: 0.693 ≈ ln(2), which is exactly the binary cross-entropy you get when the model predicts ~0.5 for every sample, so it hasn't learned anything beyond chance yet.
Try playing around with some hyperparameters, e.g. lower the learning rate.
If that doesn't help, try to scale down your use case (e.g. use just a tiny subset of your data) and check whether your model is able to overfit this small dataset.
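As a sanity check of the "overfit a tiny subset" idea, here is a minimal, self-contained sketch in NumPy rather than your actual framework: a logistic-regression stand-in for your model, trained with binary cross-entropy and SGD with momentum 0.9 on 8 hypothetical samples (the data, learning rate, and step count are all assumptions, not your setup). If even a toy model can't push the loss well below ln(2) ≈ 0.693 on a handful of samples, the training loop or loss wiring is likely at fault.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny, fully learnable dataset: 8 samples, 4 features,
# labels from a known linear rule (hypothetical stand-in for real data).
X = rng.normal(size=(8, 4))
w_true = rng.normal(size=4)
y = (X @ w_true > 0).astype(float)

def bce(p, y):
    """Binary cross-entropy; eps guards against log(0)."""
    eps = 1e-12
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Logistic-regression parameters and momentum buffers.
w = np.zeros(4)
b = 0.0
vel_w = np.zeros(4)
vel_b = 0.0
lr, momentum = 0.5, 0.9  # assumed values for the toy run, not the original 0.01

for step in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid
    grad_logits = (p - y) / len(y)           # dBCE/dlogits for sigmoid + BCE
    grad_w = X.T @ grad_logits
    grad_b = grad_logits.sum()
    vel_w = momentum * vel_w - lr * grad_w   # SGD with momentum update
    vel_b = momentum * vel_b - lr * grad_b
    w += vel_w
    b += vel_b

final_loss = bce(1.0 / (1.0 + np.exp(-(X @ w + b))), y)
print(f"loss after overfitting: {final_loss:.4f}")
```

If this kind of check succeeds with your real model on a tiny subset but the full run still sticks at 0.693, the usual suspects are the learning rate, label encoding, or a final activation/loss mismatch (e.g. applying sigmoid twice).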