How should I use Scikit-Learn Scalers inside my pytorch dataset

I am having trouble understanding how I could use scikit-learn scalers inside my PyTorch dataset.

So, I have a working training loop: for each epoch, we run through all batches in train_loader and feed the feature vectors to the model (ResNet-50), followed by the loss calculation, backward pass, and optimizer update. Pretty standard training boilerplate code.

However, now I want to pass a scaled version of my feature vectors. For that I tried introducing a MinMaxScaler from sklearn inside my custom dataset. It does transform the data as needed, but I couldn't return the fitted MinMaxScaler object, since a PyTorch Dataset doesn't allow returning arbitrary objects.

So if I wish to perform any kind of scaling, how should I proceed? Should I return the feature vectors as-is for train and validation and apply the scaler inside my training loop, or should I somehow do it inside the dataset class?

Why do you want to return the scaler object? Wouldn't you apply the scaling in `__getitem__` like any other transformation and return the transformed tensor?
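A minimal sketch of what I mean, assuming your features live in a NumPy array: fit the scaler once on the training data *outside* the dataset, pass the fitted scaler into the dataset's constructor, and call its `transform` in `__getitem__`. The names (`ScaledDataset`, `train_features`, etc.) are made up for illustration:

```python
import numpy as np
import torch
from sklearn.preprocessing import MinMaxScaler
from torch.utils.data import Dataset

class ScaledDataset(Dataset):
    def __init__(self, features, labels, scaler):
        self.features = features   # (N, D) NumPy array
        self.labels = labels
        self.scaler = scaler       # already fitted on the training features

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        # transform expects a 2-D array, so keep the row 2-D, then drop the extra dim
        x = self.scaler.transform(self.features[idx : idx + 1])[0]
        return torch.from_numpy(x).float(), self.labels[idx]

# hypothetical data for demonstration
train_features = np.random.rand(100, 4) * 10.0
train_labels = np.zeros(100, dtype=np.int64)

scaler = MinMaxScaler().fit(train_features)   # fit on the training split only
train_ds = ScaledDataset(train_features, train_labels, scaler)
```

Since the fitted `scaler` object lives outside the dataset, you can reuse the same instance for the validation dataset (calling only `transform`, never `fit`, on validation data), so there is nothing to return from the Dataset at all.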