Training in parallel: same model, multiple data groups

I’m trying to train the same model architecture on multiple data groups so that each group gets its own separate training result. Since I only have one GPU, is there a way to do this in parallel, rather than training each group sequentially? Maybe by adding an extra dimension to the input tensor?
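To make concrete what I mean by "adding a dimension": a minimal sketch of the idea, assuming PyTorch and a toy linear model per group (the shapes and names here are just for illustration). Each group's weights are stacked along a new leading dimension, so one batched matmul trains all groups' models in a single forward/backward pass:

```python
import torch

torch.manual_seed(0)

# Toy setup: G independent "models" (here, per-group linear layers)
# trained on G data groups at once via an extra leading group dimension.
G, N, D_in, D_out = 3, 16, 5, 2   # groups, samples per group, feature dims

# One weight matrix and bias per group, stacked along dim 0.
W = torch.randn(G, D_in, D_out, requires_grad=True)
b = torch.zeros(G, 1, D_out, requires_grad=True)

x = torch.randn(G, N, D_in)       # inputs, one slice per data group
y = torch.randn(G, N, D_out)      # targets, one slice per data group

opt = torch.optim.SGD([W, b], lr=0.1)
losses = []
for _ in range(100):
    opt.zero_grad()
    pred = torch.bmm(x, W) + b    # batched matmul: group g uses only W[g], b[g]
    loss = ((pred - y) ** 2).mean()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

Since `torch.bmm` never mixes slices across the leading dimension, the G models stay fully independent while sharing one GPU kernel launch per step.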