If I put `print(x.shape)` in the training loop, it returns 20 of these shapes every epoch:
torch.Size([X, 10, 695])
with X being different in all 20 instances and also changing every epoch.
If I put `print(mask.shape)` in the training loop, it returns 20 of these shapes every epoch:
torch.Size([X, 10])
The 10 is the batch size.
The 695 is the input dimension size (which I was told is the number of unique classes in the dataset).
It only needed one NumPy array to be converted to a torch tensor, and it works.
I am sure it is not efficient this way (it is phenomenally slow), but it works.
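If the slowness comes from doing the NumPy-to-torch conversion inside the training loop, one common fix is to convert once up front with `torch.from_numpy`, which shares memory with the array instead of copying it every iteration. A minimal sketch, assuming the data starts life as a NumPy array (the shapes and variable names here are made up to match the post):

```python
import numpy as np
import torch

# Toy stand-in for the dataset: shaped like the post's [X, 10, 695] batches
data_np = np.random.rand(200, 10, 695).astype(np.float32)

# Slow pattern: converting inside the loop copies the data every step
# for batch_np in batches:
#     x = torch.tensor(batch_np)  # fresh allocation + copy each iteration

# Faster: convert once before the loop; from_numpy shares the array's memory
data = torch.from_numpy(data_np)
print(data.shape, data.dtype)  # torch.Size([200, 10, 695]) torch.float32
```

Note that because `from_numpy` shares memory, modifying the tensor also modifies the original array, so copy explicitly (`.clone()`) if that matters.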
So thank you very much!!