Some questions about one-hot

I want to fine-tune a classification network, so I need to make a one-hot label.

import torch

label = torch.zeros(1, 74, dtype=torch.long)   # batch size 1, 74 categories
label = label.scatter_(dim=1, index=torch.LongTensor([[ (0~73) ]]), value=1)   # index: the true class, 0~73

print(label.shape)
**torch.Size([1, 74])**

When I pass this label to the loss, I get:

RuntimeError: multi-target not supported at ClassNLLCriterion.c:22

Official docs of CrossEntropyLoss:
Examples:

loss = nn.CrossEntropyLoss()
input = torch.randn(3, 5, requires_grad=True)
target = torch.empty(3, dtype=torch.long).random_(5)
output = loss(input, target)
output.backward()


print(target.shape)
torch.Size([3])

So I converted my label to torch.Size([74]), and got:
ValueError: Expected input batch_size (1) to match target batch_size (74).

My loss:
criterion = torch.nn.CrossEntropyLoss()
loss_contrastive = criterion(net_output, label1)

CrossEntropyLoss does not work with one-hot encoded targets.
The target is expected to have the shape [batch_size] and to contain the class indices directly; the loss uses these indices internally.

In the example, the input has a batch size of 3 and provides 5 logits per sample.
The target has the same batch size of 3 and contains random class indices in the range [0, 5).
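
For your case (batch size 1, 74 classes) you can skip the one-hot encoding and pass the class index directly. A minimal sketch, assuming net_output stands in for your model's logits of shape [1, 74] and class_idx is a placeholder for the true class:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

net_output = torch.randn(1, 74, requires_grad=True)   # stand-in for your model's logits, shape [1, 74]
class_idx = 5                                         # placeholder: the true class index in 0~73

target = torch.tensor([class_idx], dtype=torch.long)  # shape [1], holds the index, not a one-hot vector
loss = criterion(net_output, target)
loss.backward()

# if you already built a one-hot label of shape [1, 74], recover the index with argmax:
# target = one_hot_label.argmax(dim=1)

The target then matches the shape rule from the docs example: [batch_size], with one index per sample.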


Thanks, I understand this now.