LSTM time series prediction delayed (and other problems)

I am currently working on a project that uses an LSTM model to predict cryptocurrency prices for trading.
After reading many online tutorials and searching this forum, I still can't figure out how to produce better results. I am almost certain I have made mistakes somewhere, but after nearly two months of referencing other people's code, increasing and decreasing the hidden size and layer count, giving the model more or less data, normalizing the data before splitting, splitting before normalizing, and rewriting the code, all with no success, I have no idea what to try next.
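
For reference, this is the split-then-normalize order I settled on, since fitting the scaler on the full series leaks test-period statistics into training. A simplified sketch (the placeholder `prices` array, the `MinMaxScaler`, and the 80/20 split stand in for my real data and settings):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Placeholder series standing in for my real closing prices
prices = np.cumsum(np.random.randn(1000)) + 100

# Split first, then fit the scaler on the training part only,
# so statistics from the test period never leak into training
split = int(len(prices) * 0.8)
train, test = prices[:split], prices[split:]

scaler = MinMaxScaler()
train_scaled = scaler.fit_transform(train.reshape(-1, 1))
test_scaled = scaler.transform(test.reshape(-1, 1))
```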

The latest copy of my code is in

Right now, I have identified a few major problems with my code.

  1. No matter how far ahead I set the prediction horizon, the LSTM outputs a nearly identical curve delayed by that same amount. I have read that this happens because the model learns to simply copy the last observed value forward, but I can't find a way to solve it (see the first sketch after this list).
  2. If I increase `seq_len` any further, the GPU runs out of memory. I did try using DataLoaders, but they seemed much slower and produced worse results than feeding the data in directly. Is that because of how I implemented them? My attempt looked roughly like the second sketch below.
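
For problem 1, one suggestion I came across is to predict the next-step *change* (return) instead of the raw price, so that copying the last value is no longer a trivially good answer. A minimal sketch of that target construction, assuming a 1-D tensor of scaled prices (the function name, placeholder series, and shapes are just illustrative):

```python
import torch

def make_windows(series: torch.Tensor, seq_len: int):
    """Build (input window, next-step change) pairs.

    Targeting the change rather than the next price removes the
    trivial copy-the-last-value solution.
    """
    xs, ys = [], []
    for i in range(len(series) - seq_len):
        window = series[i : i + seq_len]
        change = series[i + seq_len] - series[i + seq_len - 1]
        xs.append(window)
        ys.append(change)
    # (N, seq_len, 1) inputs for a batch_first LSTM, (N,) targets
    return torch.stack(xs).unsqueeze(-1), torch.stack(ys)

xs, ys = make_windows(torch.randn(1000), seq_len=60)  # placeholder series
```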
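
For problem 2, this is roughly what my DataLoader attempt looked like (simplified; the batch size, `seq_len`, and placeholder series are not my real values):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class WindowDataset(Dataset):
    """Serves one (window, target) pair at a time, so only the
    current batch ever needs to live on the GPU."""
    def __init__(self, series: torch.Tensor, seq_len: int):
        self.series = series
        self.seq_len = seq_len

    def __len__(self):
        return len(self.series) - self.seq_len

    def __getitem__(self, i):
        x = self.series[i : i + self.seq_len].unsqueeze(-1)
        y = self.series[i + self.seq_len]
        return x, y

series = torch.randn(10_000)  # placeholder for my scaled price series
loader = DataLoader(
    WindowDataset(series, seq_len=200),
    batch_size=64,     # placeholder
    shuffle=True,      # windows are drawn independently during training
    pin_memory=True,   # faster host-to-GPU copies
    num_workers=0,     # placeholder
)

device = "cuda" if torch.cuda.is_available() else "cpu"
for x, y in loader:                    # x: (64, 200, 1), y: (64,)
    x, y = x.to(device), y.to(device)  # only one batch on the GPU at a time
    break
```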

Please help, and if there are any other resources I should look at, please let me know. Thanks a lot.