Looking at the PyTorch DataLoader docs, one can specify a custom sampler. In my current code, I sample the indices as follows:
```python
batch_index = np.random.choice(batch_order, batch_size, p=seq_sample_probs).tolist()
batch = torch.tensor(data.x_train[batch_index], dtype=torch_dtype,
                     device=torch_device, requires_grad=False)
```
I am moving everything to more modern PyTorch and now have a Dataset object. How can I connect such a sampler to the corresponding DataLoader?
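For reference, one way to reproduce the `np.random.choice(..., p=...)` behavior is `torch.utils.data.WeightedRandomSampler`, which draws indices according to per-index weights and can be passed to the DataLoader via its `sampler` argument. The sketch below uses placeholder data (`x_train`, uniform weights) in place of the objects from the question:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

x_train = torch.randn(10, 3)                  # placeholder training data
dataset = TensorDataset(x_train)

# Per-index sampling weights; they need not sum to 1. Substitute your own
# seq_sample_probs here. replacement=True matches np.random.choice's default.
weights = torch.ones(len(dataset))
sampler = WeightedRandomSampler(weights,
                                num_samples=len(dataset),
                                replacement=True)

# The sampler supplies indices; the DataLoader groups them into batches.
loader = DataLoader(dataset, batch_size=4, sampler=sampler)
for (batch,) in loader:
    print(batch.shape)
```

Note that `batch_size` and `sampler` are mutually compatible here because the sampler yields single indices; for a sampler that yields whole batches of indices at once, the `batch_sampler` argument would be the place to plug it in instead.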