I have input - torch.Size([1, 1, 50000, 9])

single line example - 1;0;9512;7507;5508;122;20;0;20

(the input values vary, ranging from 0 to 10,000)

How do I normalize such data at the network input?

I had this idea:

split each large number into two smaller numbers, for example 7625.5 -> 7.6; 25.5

I don't know how to do this, though. Do you have any ideas?
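One way to get exactly that split (a sketch, not from the thread: `split_number` and the base of 100 are my assumptions, chosen to reproduce the 7625.5 -> 7.6; 25.5 example) is `divmod`:

```python
def split_number(x, base=100.0):
    # Split a value into a coarse and a fine part, e.g. 7625.5 -> (7.6, 25.5):
    # divmod(7625.5, 100) gives (76.0, 25.5); scaling the quotient by 10 gives 7.6.
    hi, lo = divmod(x, base)
    return hi / 10.0, lo
```

Applied elementwise this turns the 9 input features into 18 smaller-ranged ones.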

I assume the input consists of a number of samples (50000?) and some features (9?).

If so, you could just calculate the mean and std for each feature and standardize it:

```
z = (x - mean) / std
```
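In PyTorch, assuming the shape `[1, 1, 50000, 9]` means samples along dim 2 and features along dim 3, that could look like this (a minimal sketch with random dummy data standing in for your real input):

```python
import torch

# Dummy data in place of the real input, values in [0, 10000)
x = torch.randint(0, 10000, (1, 1, 50000, 9)).float()

# Per-feature statistics over the sample dimension (dim=2)
mean = x.mean(dim=2, keepdim=True)
std = x.std(dim=2, keepdim=True)

# Standardized input: roughly zero mean and unit variance per feature
z = (x - mean) / std
```

In practice you would compute `mean` and `std` once on the training data and reuse them at inference time.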

The number 50,000 was just an example; there is much more data. I will pass the data in with a sliding window.

If we take the mean and std of all the data, then x varies very little in some columns. There are columns where your option will work, though.
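For those near-constant columns the std is close to zero, so dividing by it is numerically risky. A common guard (my suggestion, not from the thread) is to add a small epsilon to the denominator, which can also be applied per sliding window:

```python
import torch

def standardize(x, dim=2, eps=1e-8):
    # Standardize per feature along `dim`; eps keeps near-constant
    # columns (std close to 0) from producing huge or NaN values.
    mean = x.mean(dim=dim, keepdim=True)
    std = x.std(dim=dim, keepdim=True)
    return (x - mean) / (std + eps)

# A window where every value is constant: std == 0, but z stays finite
window = torch.full((1, 1, 100, 9), 5.0)
z = standardize(window)
```

A constant column then simply maps to zeros instead of blowing up.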