How do I find the right input shape for time series sensor data for a neural network?

Hey, I’m trying to solve a classification problem with a neural network based on time series sensor data. I have two sensors (sensor 1 and sensor 2), and each sample consists of 8 time points, basically:
SAMPLE:
------ Sensor 1, Sensor 2
time1: 0.10 , 20
time2: 0.20 , 20
time3: 0.30 , 20
time4: 0.40 , 20
time5: 0.50 , 20
time6: 0.60 , 20
time7: 0.70 , 20
time8: 0.80 , 20
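
In code, one sample currently looks roughly like this (a PyTorch tensor with the values from the table above, one row per time point):

```python
import torch

# one sample: 8 time points, 2 sensor values each -> shape [8, 2]
sample = torch.tensor([
    [0.10, 20.0],
    [0.20, 20.0],
    [0.30, 20.0],
    [0.40, 20.0],
    [0.50, 20.0],
    [0.60, 20.0],
    [0.70, 20.0],
    [0.80, 20.0],
])
print(sample.shape)  # torch.Size([8, 2])
```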

What shape do I have to bring the data into for a linear layer (the input is 8x2 per sample, the layer is 16x1)? It’s basically a 2D input, but I’m unsure because it’s a time series.

I tried shaping it as [[0.10, 20], [0.20, 20], [0.30, 20], [0.40, 20], …], but then my loss calculation wants a 2D label. I also tried flattening it to [0.1000, 0.2000, 0.3000, 0.4000, 0.5000, 0.6000, 0.7000, 0.8000, 20.0000, 20.0000, 20.0000, 20.0000, 20.0000, 20.0000, 20.0000, 20.0000], but the results are bad.
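
To make it concrete, here is a simplified sketch of the two attempts (the layer size, loss function, and label here are just placeholders, not my exact code):

```python
import torch
import torch.nn as nn

# same sample as above: 8 time points x 2 sensors
sample = torch.tensor([[0.10, 20.0], [0.20, 20.0], [0.30, 20.0], [0.40, 20.0],
                       [0.50, 20.0], [0.60, 20.0], [0.70, 20.0], [0.80, 20.0]])

layer = nn.Linear(16, 1)            # placeholder sizes
criterion = nn.BCEWithLogitsLoss()  # placeholder loss
label = torch.tensor([1.0])         # one label for the whole sample

# Attempt 1: keep the [8, 2] structure (one row per time point).
x_2d = sample                       # shape [8, 2]
# Pushing this through a per-timestep linear layer keeps the time dimension
# in the output (e.g. [8, 1]), which is why the loss then asks for a 2D label.

# Attempt 2: flatten to one vector of 16 values (all of sensor 1, then all of sensor 2).
x_flat = sample.T.reshape(-1)       # shape [16]
out = layer(x_flat)                 # shape [1]
loss = criterion(out, label)        # runs, but the results are bad
print(x_flat.shape, out.shape, loss.item())
```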