How does DataLoader create a tensor batch?

It seems you can still return a dict from __getitem__; the default collate function will batch the values for each key separately.
At least this small code snippet works in the nightly binary from approximately a week ago:

import torch
from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    def __init__(self):
        pass

    def __getitem__(self, index):
        # Each sample is a dict of tensors; the default collate_fn batches them per key
        return {'x': torch.randn(3, 24, 24), 'y': torch.randint(0, 10, (1,))}

    def __len__(self):
        return 10

dataset = MyDataset()
loader = DataLoader(dataset, batch_size=5, num_workers=0)

for batch in loader:
    print(batch['x'].shape)
    print(batch['y'].shape)
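
With batch_size=5 this prints torch.Size([5, 3, 24, 24]) and torch.Size([5, 1]), since the default collate function stacks the tensors found under each key along a new batch dimension. Roughly, it does something like the following for dict samples (a simplified sketch, not the actual torch.utils.data collate implementation):

def collate_dicts(samples):
    # samples is a list of dicts returned by __getitem__;
    # stack the tensor under each key along a new batch dimension
    return {key: torch.stack([sample[key] for sample in samples]) for key in samples[0]}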

Are you getting an error, and if so, could you post the error message?