import torch

x = torch.randn([100, 3, 28, 28])
device = torch.device("cuda:0")
x = x.to(device)

I want to trace the procedure tensor.to(device): which functions are called, how the CUDA memory is allocated, and how the tensor is moved. But there are so many Python tricks in the torch source code. Where is Tensor.to() defined? Thanks.

The .to() operation should eventually dispatch to one of the r.copy_(self, non_blocking) calls in _to_copy (in aten/src/ATen/native/TensorConversions.cpp) and from there into copy_kernel_cuda (in aten/src/ATen/native/cuda/Copy.cu) for a CPU-to-CUDA move.
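You can observe this dispatch from Python with TorchDispatchMode, which intercepts ATen ops after composite ops like .to() have been decomposed. A minimal sketch (it uses a dtype-changing .to() so it runs without a GPU; with a CUDA device the same _to_copy op appears before the kernel launch):

```python
import torch
from torch.utils._python_dispatch import TorchDispatchMode

class DispatchLogger(TorchDispatchMode):
    """Records every ATen op that reaches the Python dispatch key."""
    def __init__(self):
        super().__init__()
        self.ops = []

    def __torch_dispatch__(self, func, types, args=(), kwargs=None):
        self.ops.append(str(func))          # e.g. "aten._to_copy.default"
        return func(*args, **(kwargs or {}))

x = torch.randn(4, 3)
with DispatchLogger() as logger:
    y = x.to(torch.float64)                 # .to() that requires a real copy

print(logger.ops)
```

The log should contain aten._to_copy.default, confirming that .to() bottoms out in _to_copy whenever a copy is actually needed; a no-op .to() (same device and dtype) returns self and dispatches nothing.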


Thanks for your help.