Loss didn't decrease

My net: a stacked 5-layer LSTM followed by a Linear layer, with parameters like below:
num_layers = 5
bidirectional = 0 # 0 means unidirectional, 1 means bidirectional
batch_size = 64
input_size = 4
seq_len = 11
hidden_size = 20
output_size = 400
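The original model code was not posted, so here is a minimal sketch of how the parameters above could map onto a PyTorch model. The class name `Net`, the use of `batch_first=True`, and feeding only the last time step into the Linear layer are my assumptions, not confirmed details of the actual setup:

```python
import torch
import torch.nn as nn

# Hypothetical reconstruction of the described setup (assumptions, not the
# poster's actual code): a 5-layer unidirectional LSTM plus a Linear head.
num_layers = 5
bidirectional = False  # 0 in the post, i.e. unidirectional
batch_size = 64
input_size = 4
seq_len = 11
hidden_size = 20
output_size = 400

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers,
                            batch_first=True, bidirectional=bidirectional)
        self.fc = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # out: (batch, seq_len, hidden_size)
        out, _ = self.lstm(x)
        # assumption: only the last time step is fed to the Linear layer
        return self.fc(out[:, -1, :])

net = Net()
x = torch.randn(batch_size, seq_len, input_size)
y = net(x)
print(y.shape)  # expected: torch.Size([64, 400])
```

If the real model differs (e.g. it uses all time steps, or a different pooling of the LSTM output), the shape of the Linear layer's input would change accordingly.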

Training result:
0 loss = tensor(6.7650)
50 loss = tensor(5.8131)
100 loss = tensor(4.6847)
150 loss = tensor(4.8746)
200 loss = tensor(4.6258)
250 loss = tensor(4.7645)
300 loss = tensor(4.6798)
350 loss = tensor(4.4751)
400 loss = tensor(4.6369)
450 loss = tensor(4.5987)
500 loss = tensor(4.6832)
550 loss = tensor(4.8298)
600 loss = tensor(4.5591)
650 loss = tensor(4.7101)
700 loss = tensor(4.7349)
750 loss = tensor(4.6433)
800 loss = tensor(4.6694)
850 loss = tensor(4.4634)
900 loss = tensor(4.8253)
950 loss = tensor(4.7901)
1000 loss = tensor(4.7092)
1050 loss = tensor(4.9188)
1100 loss = tensor(4.7793)
1150 loss = tensor(4.9890)
1200 loss = tensor(4.5564)
1250 loss = tensor(4.5261)
1300 loss = tensor(4.8063)
1350 loss = tensor(4.5315)
1400 loss = tensor(4.4564)
1450 loss = tensor(4.8649)
1500 loss = tensor(4.8288)
1550 loss = tensor(4.6560)
1650 loss = tensor(4.7287)
1700 loss = tensor(5.1292)
1750 loss = tensor(4.6975)
1800 loss = tensor(4.9339)
1850 loss = tensor(4.9230)
1900 loss = tensor(4.8357)
1950 loss = tensor(4.7121)
2000 loss = tensor(4.8539)
2050 loss = tensor(4.8245)
2100 loss = tensor(4.7509)
2150 loss = tensor(4.8134)
2200 loss = tensor(5.0023)
2250 loss = tensor(5.0936)
……

Why is the loss not decreasing? Any suggestions?