How to create a Tensor with the same device as the model?

How can I create a tensor inside an nn.Module whose device follows the model whenever the model is moved?

For example, I need a convolution kernel K whose values I set myself and which does not need to be trained, but in forward I need it to perform several convolutions. Besides K, the model also has other convolutional layers.

If model.cuda() is called, this tensor should also be on the GPU; if .cpu(), it should be on the CPU.

I want a Tensor created in an nn.Module that has the same device as the Module.

If model.cuda() is called, the Tensor is also moved to the GPU, and if model.cpu() is called, it is moved to the CPU.


From Google translate:
Title:

How to create a tensor that follows the model's device changes?

Post:

How can I create a tensor in nn.Module whose device changes together with the model?
For example, I need a convolution kernel K whose parameters are given by me and which does not need to participate in training. But in the forward pass I need it to perform several convolutions. In addition to K, there are other convolutional layers.
If model.cuda() is called, this tensor is also on the GPU; if .cpu(), this tensor is also on the CPU.

Please use an online translation service before posting the question here, so that all users can help you. :wink:

He just posted a dual-language question.

I don’t think that’s entirely true, as my posted translation contains more details than the initial post, which might be helpful to answer the question. :slight_smile:

@ChenzhouWeiYu to register parameters (such as conv weights) that you then use in the forward method of your model, you can either call self.register_parameter or assign the nn.Parameter to an internal attribute via self.param = nn.Parameter(...). Registered parameters are moved together with the module by .cuda(), .cpu(), and .to().
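
A minimal sketch of the attribute-assignment approach, assuming a small 2D conv model (the module name, channel counts, and kernel values below are made up for illustration). Since the kernel in the question should not be trained, the parameter is created with requires_grad=False, so it still follows the module's device but receives no gradients:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyModel(nn.Module):  # hypothetical module name and shapes
    def __init__(self):
        super().__init__()
        # a regular, trainable conv layer
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        # fixed kernel K: values are given by hand, not learned.
        # requires_grad=False means it gets no gradients and is never updated,
        # but as a registered nn.Parameter it is still moved by
        # .cuda() / .cpu() / .to() together with the rest of the module.
        k = torch.full((8, 8, 3, 3), 1.0 / 9.0)  # placeholder values
        self.K = nn.Parameter(k, requires_grad=False)

    def forward(self, x):
        x = self.conv(x)
        # apply the fixed kernel a couple of times via the functional API
        x = F.conv2d(x, self.K, padding=1)
        x = F.conv2d(x, self.K, padding=1)
        return x


model = MyModel().cuda()   # model.K is now on the GPU
print(model.K.device)      # e.g. cuda:0
model = model.cpu()        # ...and back on the CPU
print(model.K.device)      # cpu
```

If you prefer the explicit call, self.register_parameter('K', nn.Parameter(k, requires_grad=False)) is equivalent to the attribute assignment above.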