Prepare a DataFrame for the Dataset and DataLoader

Hello everybody.
My data is a pd.DataFrame:

|      | time                | open    | high    | low     | close   |
|------|---------------------|---------|---------|---------|---------|
| 0    | 2021-08-05 12:03:00 | 109.587 | 109.594 | 109.585 | 109.593 |
| 1    | 2021-08-05 12:04:00 | 109.593 | 109.595 | 109.580 | 109.582 |
| 2    | 2021-08-05 12:05:00 | 109.581 | 109.591 | 109.572 | 109.585 |
| 3    | 2021-08-05 12:06:00 | 109.585 | 109.590 | 109.585 | 109.588 |
| 4    | 2021-08-05 12:07:00 | 109.589 | 109.607 | 109.589 | 109.600 |
| …    | …                   | …       | …       | …       | …       |
| 9995 | 2021-08-16 10:55:00 | 109.369 | 109.379 | 109.364 | 109.378 |
| 9996 | 2021-08-16 10:56:00 | 109.377 | 109.392 | 109.377 | 109.386 |
| 9997 | 2021-08-16 10:57:00 | 109.385 | 109.398 | 109.384 | 109.398 |
| 9998 | 2021-08-16 10:58:00 | 109.397 | 109.422 | 109.397 | 109.416 |
| 9999 | 2021-08-16 10:59:00 | 109.415 | 109.419 | 109.408 | 109.416 |

I want to use rates_frame['close'] for the Dataset and DataLoader:

```python
# DATA
data_ld = torch.FloatTensor(rates_frame['close'].values[0:80000])
data_va = torch.FloatTensor(rates_frame['close'].values[80000:])
```

Datasets

```python
import torch.utils.data as data_utils

train_tensor_dl = data_utils.TensorDataset(data_ld)
train_tensor_vl = data_utils.TensorDataset(data_va)
```

Data loaders

```python
batch_size = 64  # for example; batch_size was not defined above

train_loader = torch.utils.data.DataLoader(dataset=train_tensor_dl,
                                           batch_size=batch_size,
                                           shuffle=True,
                                           num_workers=0)

test_loader = torch.utils.data.DataLoader(dataset=train_tensor_vl,
                                          batch_size=batch_size,
                                          shuffle=False,
                                          num_workers=0)
```
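To sanity-check the setup, here is a minimal runnable version with a short stand-in tensor for `rates_frame['close'].values` (the values are placeholders, not my real data), iterating one batch:

```python
import torch
import torch.utils.data as data_utils

# stand-in for rates_frame['close'].values (hypothetical values)
data_ld = torch.FloatTensor([109.593, 109.582, 109.585, 109.588, 109.600])

train_tensor_dl = data_utils.TensorDataset(data_ld)
train_loader = torch.utils.data.DataLoader(dataset=train_tensor_dl,
                                           batch_size=2,
                                           shuffle=False,
                                           num_workers=0)

# each batch is a one-element tuple, because the TensorDataset
# was built from a single tensor
(batch,) = next(iter(train_loader))
print(batch.shape)  # torch.Size([2])
```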

Is this correct?
How can I prepare rates_frame['close'] for the Dataset and DataLoader?

rates_frame['close'] returns a pandas Series, not a numpy array, so convert it first; then you could use:

```python
data = torch.from_numpy(rates_frame['close'].to_numpy())
```

and `.clone()` it if needed, since `torch.from_numpy` shares memory with the source array.
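To make the memory sharing concrete, here is a small sketch with a hypothetical frame (the column values are made up): mutating the numpy array is visible through the `from_numpy` tensor, while a `.clone()` is unaffected.

```python
import pandas as pd
import torch

# hypothetical frame standing in for rates_frame
df = pd.DataFrame({"close": [109.593, 109.582, 109.585, 109.588]})

arr = df["close"].to_numpy()            # underlying numpy array of the Series
shared = torch.from_numpy(arr)          # shares memory with arr
owned = torch.from_numpy(arr).clone()   # independent copy

arr[0] = 0.0
print(shared[0].item())  # 0.0, the shared tensor sees the change
print(owned[0].item())   # 109.593, the clone does not
```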
Besides that, I would assume your approach should work. One thing to double-check: if rates_frame only has the 10,000 rows shown, `values[80000:]` will be an empty array, so pick split indices within the length of your data.
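Also, if the goal is to predict from past prices rather than feed single closing values to the model, one common approach is to slice the series into overlapping windows before building the TensorDataset. A sketch (the synthetic `close` array and the `window` length are placeholders; with real data you would use `rates_frame['close'].to_numpy()`):

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

# hypothetical close prices standing in for the real series
close = np.arange(100, dtype=np.float32)

window = 10  # hypothetical input length
# build (input window, next value) pairs
x = np.stack([close[i:i + window] for i in range(len(close) - window)])
y = close[window:]

dataset = TensorDataset(torch.from_numpy(x), torch.from_numpy(y))
loader = DataLoader(dataset, batch_size=32, shuffle=True)

xb, yb = next(iter(loader))
print(xb.shape, yb.shape)  # torch.Size([32, 10]) torch.Size([32])
```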

PS: you can post code snippets by wrapping them into three backticks, which makes debugging easier :wink:

This tutorial might be a good starter.

Unfortunately, I could not find good training material for deep learning in the financial markets with PyTorch, but Keras has a lot of good tutorials, so I moved to Keras.