What does nn.CrossEntropyLoss(10) mean?
Could I just use nn.CrossEntropyLoss()?
nn.CrossEntropyLoss() constructs a CrossEntropyLoss object. Its first positional argument is weight, which must be a 1-D tensor of per-class weights (one entry per class), so passing the plain integer 10 is not valid. The weight can only be set once, when the object is created. If you want to adjust the weight dynamically during training, use torch.nn.functional.cross_entropy() instead, which accepts a weight tensor on every call.
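A minimal sketch of the two APIs (the shapes and weight values here are arbitrary examples):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(4, 10)            # batch of 4 samples, 10 classes
targets = torch.tensor([1, 0, 4, 9])   # one class index per sample

# Module form: weight is fixed at construction time.
weight = torch.ones(10)                # 1-D tensor, one weight per class
criterion = nn.CrossEntropyLoss(weight=weight)
loss_module = criterion(logits, targets)

# Functional form: a fresh weight tensor may be passed on every call,
# which is what allows adjusting the weights during training.
dynamic_weight = torch.rand(10)
loss_functional = F.cross_entropy(logits, targets, weight=dynamic_weight)
```

With all weights equal to 1, the module form reduces to the unweighted loss, since the default reduction divides by the sum of the selected weights.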