Solution for a Torch (Lua code) syntax in PyTorch

I have a piece of code written in Lua (Torch) and I want to know its PyTorch equivalent. How do I implement these lines in PyTorch? Can somebody help me with it? The code is below:

real_ctx[{{},{1},{},{}}][mask_global] = 2 * 117.0/255.0 - 1.0
real_ctx[{{},{2},{},{}}][mask_global] = 2 * 104.0/255.0 - 1.0
real_ctx[{{},{3},{},{}}][mask_global] = 2 * 123.0/255.0 - 1.0
input_ctx:copy(real_ctx)

This is in reference to “Context Encoders: Feature Learning by Inpainting” by Deepak Pathak et al.
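
For context, the constants in those lines are 0–255 pixel values rescaled to [-1, 1] (each colour channel of the masked region is overwritten with a constant). A quick check of the arithmetic, using only the values from the snippet above:

# The three constants rescale a 0-255 pixel value to [-1, 1]: x -> 2 * x / 255 - 1
means = [117.0, 104.0, 123.0]
normalized = [2 * m / 255.0 - 1.0 for m in means]
print(normalized)  # approximately [-0.0824, -0.1843, -0.0353]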

self.mask_global = torch.ByteTensor(1, 1, 256, 256)
self.mask_global.zero_()

real_A = torch.Tensor(16, 3, 256, 256)
# Fill each channel of the masked region with the corresponding constant
real_A.narrow(1, 0, 1).masked_fill_(self.mask_global, 2*117.0/255.0 - 1.0)
real_A.narrow(1, 1, 1).masked_fill_(self.mask_global, 2*104.0/255.0 - 1.0)
real_A.narrow(1, 2, 1).masked_fill_(self.mask_global, 2*123.0/255.0 - 1.0)
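
Putting the pieces together, a self-contained sketch might look like the following. It uses the same narrow / masked_fill_ idiom as above, switches the mask to torch.bool (recent PyTorch versions warn about ByteTensor masks), and adds the step corresponding to the final input_ctx:copy(real_ctx) line. The tensor names and shapes come from the snippets above; the mask region and the random input are placeholders, not part of the original code:

import torch

# Boolean mask of the region to overwrite; broadcast over the batch dimension
mask_global = torch.zeros(1, 1, 256, 256, dtype=torch.bool)
mask_global[:, :, 64:192, 64:192] = True   # placeholder region; the real mask comes from elsewhere

real_ctx = torch.randn(16, 3, 256, 256)    # stand-in for the real image batch in [-1, 1]

# Fill each colour channel of the masked region with its rescaled constant
real_ctx.narrow(1, 0, 1).masked_fill_(mask_global, 2 * 117.0 / 255.0 - 1.0)
real_ctx.narrow(1, 1, 1).masked_fill_(mask_global, 2 * 104.0 / 255.0 - 1.0)
real_ctx.narrow(1, 2, 1).masked_fill_(mask_global, 2 * 123.0 / 255.0 - 1.0)

# Equivalent of the Lua line input_ctx:copy(real_ctx)
input_ctx = torch.empty_like(real_ctx)
input_ctx.copy_(real_ctx)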

Thanks a lot, @Naruto-Sasuke!

@Naruto-Sasuke Have you implemented this paper (Context Encoders: Feature Learning by Inpainting) in PyTorch?