Does pinning memory still work when there are random augmentations?

When a Dataset has transforms such as torchvision.transforms.RandomErasing, the resulting data tensor has randomly modified values. In that case, does pin_memory in the DataLoader class still have any practical advantage?

As I understand it, pinned memory is used to make host-to-device memory transfers (e.g., CPU to GPU) more efficient, so it is orthogonal to an augmentation such as RandomErasing.
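To make this concrete, here is a minimal sketch (using a toy hypothetical dataset, not torchvision itself) showing that the DataLoader pins the *collated batch* after the per-sample augmentation has already run, so random contents don't interfere with pinning:

```python
import torch
from torch.utils.data import DataLoader, Dataset

class RandomEraseDataset(Dataset):
    """Toy dataset that applies a random erasing-style augmentation on every access."""
    def __len__(self):
        return 8

    def __getitem__(self, idx):
        x = torch.ones(3, 32, 32)
        # Zero out a randomly positioned 8x8 patch, mimicking RandomErasing.
        i, j = torch.randint(0, 24, (2,))
        x[:, i:i + 8, j:j + 8] = 0
        return x

# pin_memory=True asks the DataLoader to copy each collated batch into
# page-locked memory *after* the augmentation has produced the values,
# so what the values are is irrelevant to whether pinning helps.
loader = DataLoader(RandomEraseDataset(), batch_size=4, pin_memory=True)
batch = next(iter(loader))
# Note: on a CPU-only build, PyTorch warns and skips pinning, so
# is_pinned() may be False there; with a CUDA device it reports True.
print(batch.shape, batch.is_pinned())
```

The pinning only speeds up the subsequent `batch.to("cuda", non_blocking=True)` transfer; it happens once per batch, regardless of what the augmentation wrote into the tensor.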

@eqy OK, that makes sense. I thought the contents of memory mattered when allocating page-locked memory… since the contents are consumed only a single time.