Are torchvision transforms done in parallel?


First off, sorry if my question is probably more related to Python itself rather than PyTorch.
I am unclear whether transform pipelines created with transforms.Compose are applied in parallel over the batch.

Say I have a batch of 2 samples (fetched with a DataLoader, num_workers=0)
and a transforms.Compose with two transformations A and B.

What really happens?

(A & B over sample 1), then (A & B over sample 2), or
(A over sample 1), then (A over sample 2 & B over sample 1), then (B over sample 2)?

I have the same question regarding a forward pass in any sequential model.
Are samples passed one by one through the entire network, or are they sent in a queue layer by layer?

Each sample will be loaded and processed in Dataset.__getitem__.
The common approach is to load and transform a single sample, so the transformations are performed sequentially in the main process, given you are using num_workers=0.
torchvision.transforms also accept tensors, and you could use a BatchSampler if you want to load and process multiple samples at once.
If you use num_workers>0, each worker process will load and create its own batch.

Samples are passed to the model as a batch, and all layers accept batched inputs, thus processing all samples at once.
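A small sketch of this: the model below is an arbitrary example, but the pattern holds for any nn.Sequential. The whole batch goes through each layer in one call, and the result matches running the samples through the network one by one (up to floating-point rounding):

```python
import torch
import torch.nn as nn

# Example sequential model; every layer handles a leading batch dimension.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

batch = torch.randn(2, 4)  # 2 samples, 4 features each
out = model(batch)         # one forward pass over the whole batch
print(out.shape)           # torch.Size([2, 2])

# Passing each sample individually gives the same values:
per_sample = torch.stack([model(batch[i]) for i in range(2)])
print(torch.allclose(out, per_sample))  # True
```

So there is no per-sample queue inside the network: each layer computes its output for all samples in the batch before the next layer runs.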