How can I tell if PyTorch is using RDMA?

Hi,
I am using distributed PyTorch with the NCCL backend; the code looks like the following:

dist.init_process_group(backend='nccl', init_method=args.dist_url,
                                world_size=args.world_size, rank=args.rank)
...
model.cuda()
model = torch.nn.parallel.DistributedDataParallel(model)
...

How can I check whether the program actually used RDMA?
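For context, as far as I understand, setting the environment variable NCCL_DEBUG=INFO before launching makes NCCL print which network transport it selected: lines mentioning "NET/IB" indicate the InfiniBand verbs transport (RDMA), while "NET/Socket" indicates plain TCP. A minimal sketch of scanning such logs (the helper function and the sample log line below are my own illustration, not output from my actual run):

```python
import re

def nccl_transport(log_lines):
    """Return 'rdma', 'socket', or 'unknown' based on NCCL INFO log lines.

    Assumes NCCL_DEBUG=INFO was set, so NCCL logs its transport choice.
    """
    for line in log_lines:
        if re.search(r"NET/IB", line):
            return "rdma"      # InfiniBand verbs transport -> RDMA in use
        if re.search(r"NET/Socket", line):
            return "socket"    # plain TCP sockets, no RDMA
    return "unknown"

# Illustrative log line (made up for this example):
sample = ["node0:1234:1234 [0] NCCL INFO NET/IB : Using [0]mlx5_0:1/IB"]
print(nccl_transport(sample))  # rdma
```

Is grepping the NCCL_DEBUG=INFO output like this the right way to confirm it, or is there a more direct check?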

Thanks a lot!