Torch.stack yielding wrong dimensions

I am quite new to PyTorch. I am trying to create data within a DataLoader, and my code looks like this:

self.a = []
self.c = []
# ... within a for loop
self.a.append(torch.stack([b[ith_idx][j],
                           b[ith_idx][rnd_dist],
                           b[rnd_cls_idx][rnd_dist_rnd_cls]]))
self.c.append([1, 0])

where b is a Python list of tensors. For example, the first element of b has shape torch.Size([46, 3, 512, 512]).

After the loop:

        self.a = torch.stack(self.a)
        self.c = torch.tensor(self.c)

I notice that a and c end up with shapes [500, 3, 3, 512, 512] and [500, 2], while I was expecting [500, 3, 3, 512, 512] and [500].
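As a sanity check, here is a minimal sketch reproducing the behavior with made-up small shapes (each element of b is [4, 3, 8, 8] instead of [46, 3, 512, 512], and the loop runs 10 times instead of 500; the indices are placeholders):

```python
import torch

# stand-in for b: a list of tensors, each of shape [4, 3, 8, 8]
b = [torch.randn(4, 3, 8, 8) for _ in range(5)]

a, c = [], []
for _ in range(10):
    # three [3, 8, 8] slices stacked -> one [3, 3, 8, 8] tensor
    a.append(torch.stack([b[0][1], b[0][2], b[1][3]]))
    # a pair of values per sample -> each entry has length 2
    c.append([1, 0])

a = torch.stack(a)      # 10 x [3, 3, 8, 8] -> [10, 3, 3, 8, 8]
c = torch.tensor(c)     # 10 x [1, 0] pairs -> [10, 2]
print(a.shape)  # torch.Size([10, 3, 3, 8, 8])
print(c.shape)  # torch.Size([10, 2])
```

So the [500, 3, 3, 512, 512] shape for a is exactly what the code builds, and the trailing 2 in c's shape comes from appending a two-element list per iteration.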

Any pointers as to why this is happening would be helpful.

Since you are appending two values per sample to self.c, the expected shape is [N, 2].
Could you explain your use case a bit and how self.c should be stacked to create a [500] tensor?
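For example (a hypothetical sketch, not your actual labeling scheme), appending a single scalar label per sample instead of a pair would give a 1-D tensor:

```python
import torch

c = []
for _ in range(10):   # 500 iterations in the original
    c.append(1)       # one scalar label per sample, not [1, 0]

c = torch.tensor(c)
print(c.shape)  # torch.Size([10])
```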

@ptrblck: Actually, your explanation makes sense, it does indeed seem correct. Thank you for your time!