Hi all,
I’m new to PyTorch, so this may be a trivial question, but still. I’m trying to create one set of parameters and “duplicate” it multiple times, like this:
self.vb = nn.Parameter(torch.FloatTensor(64))
self.var_bias = torch.cat([self.vb]*10)
I then use only self.var_bias throughout my model. I was thinking of it as part of a fixed computation graph, meaning self.var_bias would always be the concatenation of 10 identical copies of self.vb. I expected (self.var_bias[:64] == self.vb) to always be True. However, if I put a breakpoint after a couple of hundred iterations, I see that’s not the case. What am I missing here? Did torch.cat create a new Variable entirely detached from self.vb?
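Here is a minimal standalone sketch of what I think I’m seeing, with the sizes shrunk for readability (4 instead of 64, 3 copies instead of 10). It just builds the concatenation once and then updates the parameter in place, the way an optimizer step would:

```python
import torch
import torch.nn as nn

# One parameter, "duplicated" via cat (sizes are illustrative).
vb = nn.Parameter(torch.zeros(4))
var_bias = torch.cat([vb] * 3)  # evaluated eagerly, copying vb's current values

# Right after construction the slices match.
assert torch.equal(var_bias[:4], vb)

# Simulate an optimizer updating vb in place.
with torch.no_grad():
    vb += 1.0

# var_bias was never recomputed, so it still holds the old values.
print(torch.equal(var_bias[:4], vb))  # False
```

If this sketch is right, the cat is evaluated once at construction time rather than being a persistent graph node, which would explain the divergence I observe at the breakpoint.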
Thanks.