Difference between using a sampler and a batch sampler

Hi, I’m confused about the usage of Sampler and BatchSampler, since both can be passed as arguments when instantiating a DataLoader object. I would like to use a random subset of samples from my dataset during training. There are two ways I could do this:

  1. Using RandomSampler: I could create a RandomSampler over my dataset, specify the number of samples I want drawn without replacement, and pass it to the DataLoader via the sampler argument. Additionally, let’s say (as an example) I choose a batch size of 4 in the DataLoader.

  2. Using BatchSampler: Here, I could wrap the same RandomSampler in a BatchSampler with a batch size of 4, and pass that to the DataLoader via the batch_sampler argument (both versions are sketched in the code below).
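
For concreteness, here is roughly what I have in mind. The toy TensorDataset and the choice of 100 samples are just placeholders, and I’m assuming a recent PyTorch version where RandomSampler accepts num_samples together with replacement=False:

```python
import torch
from torch.utils.data import BatchSampler, DataLoader, RandomSampler, TensorDataset

dataset = TensorDataset(torch.arange(1000).float())  # placeholder dataset

# Method 1: sampler + batch_size -- the DataLoader groups the sampler's
# indices into batches of 4 on its own.
sampler = RandomSampler(dataset, replacement=False, num_samples=100)
loader_1 = DataLoader(dataset, sampler=sampler, batch_size=4)

# Method 2: batch_sampler -- the BatchSampler yields lists of 4 indices at a
# time, so batch_size, shuffle, sampler, and drop_last must not also be
# passed to the DataLoader.
batch_sampler = BatchSampler(
    RandomSampler(dataset, replacement=False, num_samples=100),
    batch_size=4,
    drop_last=False,
)
loader_2 = DataLoader(dataset, batch_sampler=batch_sampler)

# Both loaders yield batches of 4 samples (25 batches per pass).
for (batch,) in loader_1:
    print(batch.shape)  # torch.Size([4])
    break
```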

Is there a difference between the two methods I described above, and does one of them have a performance benefit over the other?

Thanks.