What's the param `size_average` in Loss? It doesn't work as expected

Hi all,

I’m using PyTorch to build my own model, but I’ve run into a problem.
I noticed that many loss functions have a size_average parameter, e.g. torch.nn.CrossEntropyLoss(weight=None, size_average=True).
The loss function takes Input: (N, C) and Target: (N) and returns a single value, which I assume is averaged over the batch size N.

size_average (bool, optional): By default, the losses are averaged
                over observations for each minibatch. However, if the field
                sizeAverage is set to False, the losses are instead summed
                for each minibatch.

But when I set size_average=False, it still returns a single value instead of one loss per batch element.
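Here is a minimal snippet showing what I observe (the logits and targets are just made-up toy values):

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

# Toy batch: N = 4 examples, C = 3 classes (values are arbitrary).
logits = Variable(torch.randn(4, 3))
target = Variable(torch.LongTensor([0, 2, 1, 2]))

loss_mean = nn.CrossEntropyLoss(size_average=True)(logits, target)   # mean over the batch
loss_sum = nn.CrossEntropyLoss(size_average=False)(logits, target)   # sum over the batch

# Both are single numbers; size_average only switches between mean and sum.
print(loss_mean)
print(loss_sum)
```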
So what does size_average actually do, and how can I get the loss of each individual example in the batch?
Thanks a lot!


Currently, all losses in PyTorch return a single number, which is either the sum or the average of the per-element losses.
Obtaining per-element losses is not yet implemented in PyTorch; it is being tracked in https://github.com/pytorch/pytorch/issues/264
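In the meantime, a rough workaround is to compute the per-example cross-entropy yourself from the log-probabilities. This is just a sketch (the helper name is made up for illustration, and it ignores the weight argument), not an official API:

```python
import torch
import torch.nn.functional as F
from torch.autograd import Variable

def per_example_cross_entropy(logits, target):
    # logits: (N, C) unnormalized scores, target: (N) class indices.
    log_probs = F.log_softmax(logits)  # softmax over the class dimension (pass dim=1 explicitly on newer versions)
    # Pick the log-probability assigned to the correct class of each example.
    picked = log_probs.gather(1, target.view(-1, 1)).view(-1)
    return -picked  # shape (N,): one loss value per example

logits = Variable(torch.randn(4, 3))
target = Variable(torch.LongTensor([0, 2, 1, 2]))

losses = per_example_cross_entropy(logits, target)
print(losses)          # per-example losses
print(losses.mean())   # should match CrossEntropyLoss(size_average=True)
print(losses.sum())    # should match CrossEntropyLoss(size_average=False)
```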
