Basic operations on multiple GPUs

Hi, I just switched from TF and I'm loving PyTorch. I would like to use parallel GPU computation for basic operations like matmul and torch.randn (I'm doing evolution strategies). Is there any way to implement this in PyTorch? I've only seen examples that involve using the nn.DataParallel wrapper on models.

Could you explain your use case a bit more?

If you have separate computations, you could use each GPU to perform a single op:

# CUDA kernels are launched asynchronously, so these two matmuls can overlap on different devices
res0 = torch.matmul(x.to('cuda:0'), y.to('cuda:0'))
res1 = torch.matmul(x.to('cuda:1'), y.to('cuda:1'))

Another approach would be to use nn.DataParallel if you would like to send data chunks to each GPU and perform the same operation on each, as in the sketch below.
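Something like this (assuming at least two visible GPUs; the layer and shapes are just placeholders):

import torch
import torch.nn as nn

layer = nn.Linear(128, 64).to('cuda:0')
parallel = nn.DataParallel(layer)            # replicates the layer onto all visible GPUs

x = torch.randn(1024, 128, device='cuda:0')  # dim 0 is split into one chunk per GPU
out = parallel(x)                            # chunks run in parallel, results are gathered on cuda:0
print(out.shape)                             # torch.Size([1024, 64])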


Most of my operations are sequential. What I would like to do is split my arrays along the population dimension across multiple GPUs. Can I use nn.DataParallel without using a Model class?

Yes, you can 🙂 See Uneven GPU utilization during training backpropagation - #14 by colllin for an example of wrapping the loss function with DataParallel.
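The rough idea (this isn't the exact code from that post; the function and shapes are just placeholders) is to wrap your function in a thin nn.Module so that nn.DataParallel can scatter its input along dim 0:

import torch
import torch.nn as nn

class FunctionWrapper(nn.Module):
    def __init__(self, fn):
        super().__init__()
        self.fn = fn

    def forward(self, x):
        return self.fn(x)

def my_op(x):
    # per-candidate computation; x arrives as a chunk of the population
    return torch.matmul(x, x.transpose(-1, -2))

parallel_op = nn.DataParallel(FunctionWrapper(my_op).to('cuda:0'))

pop = torch.randn(512, 32, 32, device='cuda:0')  # population dimension first
result = parallel_op(pop)                        # each GPU processes a slice of the population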

I'm using evolution methods; I don't have backpropagation and my loss function is not differentiable. I want to parallelize sequential basic operations like torch.matmul.

Then you don't need to call .backward().

Can you elaborate, please?


I linked a post describing how to wrap an arbitrary loss function in DataParallel so that you can compute it by scattering the dataset onto multiple GPUs. Then you responded:

I'm using evolution methods; I don't have backpropagation and my loss function is not differentiable. I want to parallelize sequential basic operations like torch.matmul.

and I mentioned that in that case, if you don't want to use backpropagation, you don't need to call .backward() on your loss function. In other words, I wasn't sure how your answer related to wrapping a loss function in DataParallel: you can use it whether or not you do backpropagation, because DataParallel is not tied to backpropagation as far as I know.
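Concretely, something like this (the fitness function and shapes here are just placeholders) would evaluate a non-differentiable fitness on multiple GPUs without ever calling .backward():

import torch
import torch.nn as nn

class FitnessModule(nn.Module):
    def forward(self, candidates):
        # hypothetical non-differentiable fitness: count positive genes per candidate
        return (candidates > 0).sum(dim=1).float()

parallel_fitness = nn.DataParallel(FitnessModule().to('cuda:0'))

population = torch.randn(4096, 256, device='cuda:0')  # (population, genome) layout
with torch.no_grad():                                  # no graph is built, no backpropagation
    scores = parallel_fitness(population)              # population is split across the GPUs
print(scores.shape)                                    # torch.Size([4096])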
