Using PyTorch DataLoader for data loading and TF as the training backend

Is it possible to use PyTorch DataLoader for data loading and TF as the training backend?

This should be possible in principle, unless the multiprocessing used by the DataLoader workers conflicts with TF in some way (in which case I would expect a warning or error to surface).
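
A minimal sketch of what this could look like, under the assumption that the data is pulled from the DataLoader as CPU tensors, converted to NumPy, and fed batch by batch into a tf.keras model. The toy dataset, model, batch size, and worker count below are placeholders, not part of the original question:

```python
import tensorflow as tf
import torch
from torch.utils.data import DataLoader, TensorDataset


def main():
    # Hypothetical toy dataset served through a PyTorch DataLoader.
    features = torch.randn(1024, 20)
    labels = torch.randint(0, 2, (1024,)).float()
    loader = DataLoader(
        TensorDataset(features, labels),
        batch_size=32,
        shuffle=True,
        num_workers=2,  # worker processes come from PyTorch, not TF
    )

    # A small tf.keras model trained on batches pulled from the DataLoader.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])

    for epoch in range(3):
        for x_batch, y_batch in loader:
            # Hand TF plain NumPy arrays; .numpy() is cheap for CPU tensors.
            loss, acc = model.train_on_batch(x_batch.numpy(), y_batch.numpy())
        print(f"epoch {epoch}: loss={loss:.4f} acc={acc:.4f}")


if __name__ == "__main__":
    # The main-guard matters here: with num_workers > 0 the DataLoader spawns
    # worker processes, and on platforms that use the "spawn" start method the
    # script is re-imported in each worker.
    main()
```

Since the batches cross the framework boundary as NumPy arrays, TF never sees the DataLoader workers directly, which is one way to keep the two multiprocessing/threading setups from stepping on each other.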