Multivariate LSTM classification

Hi all,

I am trying out a multivariate LSTM for a classification problem, starting with a simple custom dataset built as follows:

import random

import numpy as np
import pandas as pd

# build 2000 rows, each a random sequence of 30 distinct values drawn from [0, 100)
for i in range(2000):
    seq = random.sample(range(0, 100), 30)
    seq = np.array(seq).reshape(1, -1)
    if i == 0:
        data = pd.DataFrame(seq)
    else:
        data = pd.concat((data, pd.DataFrame(seq)), axis=0)

which is essentially a dataset with 2000 samples and 30 "features". The custom target for this dataset is defined as follows:

# per-row sum of the 30 values, compared with the sum 5 rows earlier
sum_ = data.sum(axis=1)
reduced = sum_ - sum_.shift(periods=5)

The target is 1 if reduced is positive and 0 if it is negative.
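In case it helps to see that step explicitly, here is a minimal sketch of the labelling; dropping the first five NaN rows produced by the shift is my own assumption, not something fixed above:

# rows 0-4 of `reduced` are NaN because of the 5-period shift; dropping them is an assumption
target = (reduced.dropna() > 0).astype(int)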

A trial was made with an LSTM with 2 layers of hidden size 256. Five epochs into training, the BCE loss does not seem to drop but instead fluctuates a lot, as shown below:

[loss curve plot: losses_4]

The loss was initially around 0.6 but then fluctuates between roughly 1e-3 and 2.5.

The optimizer is Adam with a one-cycle learning-rate schedule (max LR of 1e-3).
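For reference, a minimal sketch of the kind of setup described above, assuming PyTorch; the class name SumLSTM, the input layout (sequence length 30 with one feature per step), the batch size, and the total_steps value are illustrative placeholders rather than the exact code from my run:

import torch
import torch.nn as nn

class SumLSTM(nn.Module):
    def __init__(self, hidden_size=256, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, 1)      # one logit for binary classification

    def forward(self, x):                        # x: (batch, seq_len=30, features=1)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1])               # classify from the last time step

model = SumLSTM()
criterion = nn.BCEWithLogitsLoss()               # takes raw logits, no final sigmoid needed
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# total_steps = epochs * batches per epoch; 5 epochs and batch size 32 are placeholders
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, total_steps=5 * ((1995 + 31) // 32))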

Is there anything that could cause such a result, or is the custom dataset still too complicated?

Thanks in advance for any advice!

Merry Christmas to you all~