Construct PyTorch dataset from a list

Hi everyone,

I recently came across the approach below in TensorFlow, which creates a TensorFlow dataset from a list and lets you batch the data. I want to do something similar in PyTorch. My dataset does not fit into memory, so I wanted to know whether there is an equivalent approach with PyTorch dataloaders?

import tensorflow as tf 
dataset = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
dataset = dataset.map(lambda x: x*2)
dataset = dataset.batch(64)
dataset = dataset.prefetch(tf.data.experimental.AUTOTUNE)

@ptrblck, could you kindly assist?

Does

X = torch.Tensor(range(11))
X = torch.square(X)

do what you want?

No, I have added more information to the question.

I don’t know TensorFlow, so I can’t figure out what that code is supposed to do. If you could describe in words what it is doing, I can try to figure out how to do the same in PyTorch.
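
In words, the tf.data pipeline above turns a Python list into a stream of elements, doubles each one, groups the results into batches of 64, and prefetches upcoming batches in the background. A rough PyTorch equivalent pairs a map-style Dataset with a DataLoader: batch_size handles the batching, and num_workers / prefetch_factor handle the background prefetching. The sketch below is only illustrative (the ListDataset class is a made-up name, not part of the PyTorch API), and it assumes the "map" step can simply be done inside __getitem__:

import torch
from torch.utils.data import Dataset, DataLoader

class ListDataset(Dataset):
    """Map-style dataset over a plain Python list."""

    def __init__(self, data):
        self.data = data

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # "map" step: double each element, like dataset.map(lambda x: x * 2)
        return torch.tensor(self.data[idx] * 2)

if __name__ == "__main__":
    dataset = ListDataset([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])

    # batch_size plays the role of dataset.batch(64); num_workers > 0 makes
    # the DataLoader prepare batches in background worker processes (roughly
    # what prefetch() does), and prefetch_factor is how many batches each
    # worker keeps ready ahead of time.
    loader = DataLoader(dataset, batch_size=64, num_workers=2, prefetch_factor=2)

    for batch in loader:
        print(batch)  # one batch of doubled values, e.g. tensor([2, 4, ..., 20])

For a dataset that genuinely does not fit into memory, the same DataLoader arguments apply; only __getitem__ changes, loading the idx-th sample lazily (for example from files on disk) instead of indexing an in-memory list, or an IterableDataset can be used for streaming sources.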