Unexpected CUDA behaviour when registering buffers

Hello all,
I’ve encountered an issue while registering buffers from another module onto a custom-made module; doing the same thing with parameters (registering them with register_parameter) behaved as expected.

Here is a sample of my code:

import torch
import torchvision.models

class MyModule(torch.nn.Module):
    def __init__(self):
        super(MyModule, self).__init__()

resnet18_model = torchvision.models.resnet18()
my_model = MyModule()

# register resnet18's buffers and parameters on my module
for buffer_name, buffer in list(resnet18_model.named_buffers()):
    my_model.register_buffer(buffer_name.replace('.', '_'), buffer)
for param_name, param in list(resnet18_model.named_parameters()):
    my_model.register_parameter(param_name.replace('.', '_'), param)

# move the source model to the GPU
resnet18_model.cuda()

print("Checking Buffers")
for buffer in my_model.buffers():
    # check whether the is_cuda flag is set
    if not buffer.is_cuda:
        print("found a buffer that is not on CUDA")

print("Checking Parameters")
for param in my_model.parameters():
    # check whether the is_cuda flag is set
    if not param.is_cuda:
        print("found a parameter that is not on CUDA")

My expected behaviour of this code was as follows:
after moving the resnet18 model to CUDA, and after registering its parameters and buffers on my module, I expected both of them to be on CUDA. As you can see, I checked whether the is_cuda flag is set, and my results were as follows:

Checking Buffers
Checking Parameters

It seems as if register_buffer makes some sort of copy of the input buffer rather than keeping a reference to it, while register_parameter behaves as expected.
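A minimal way to test the copying hypothesis would be an identity check right after registration (a sketch using BatchNorm1d's running_mean buffer as a stand-in for the resnet18 buffers, so it runs without a GPU):

```python
import torch

# BatchNorm1d owns a 'running_mean' buffer we can borrow for the check
src = torch.nn.BatchNorm1d(4)
dst = torch.nn.Module()
dst.register_buffer('running_mean', src.running_mean)

# prints True if register_buffer keeps a reference to the same tensor
# object, False if it stored a copy
print(dst.running_mean is src.running_mean)
```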

Is this a bug, or am I missing something?
Is there a different way of registering references to another module's buffers?
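The only workaround I can think of is to move the source model to the target device *before* registering its buffers, so the registered tensors are already where I want them (again sketched with BatchNorm1d standing in for resnet18, and falling back to CPU so it runs anywhere):

```python
import torch

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# move the source model first, then register its buffers
src = torch.nn.BatchNorm1d(4).to(device)
holder = torch.nn.Module()
for name, buf in src.named_buffers():
    holder.register_buffer(name.replace('.', '_'), buf)

# every registered buffer should now live on `device`
print(all(b.device.type == device.type for b in holder.buffers()))
```

This avoids the problem only for a one-off move, though; if the source model is moved again later, the question of whether the registered buffers follow it still stands.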