How to calculate the entropy of an image? (my softmax results don't sum to 1)

Hi, I am very new to ML and this is my first post here, so sorry if something doesn't match the standards of the site.
I am trying to calculate the entropy of an image. Before that, I apply the softmax function to get the probabilities. There is no error in this process, but the softmax result doesn't sum to 1. It should add up to 1, right?
Here is the relevant part of the code:

    transform = Compose([ToTensor(), Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225))])
    x = transform(image)
    x = x.unsqueeze_(0).cuda()
    with torch.no_grad():
        y = model(x)
    logits = torch.squeeze(y)
    smax = nn.Softmax(dim=1)
    smax = smax(y)
    print("SOFTMAX SUM:", smax.sum())
    # SOFTMAX SUM: tensor(2073600., device='cuda:0')
    # shapes: x      torch.Size([1, 3, 1080, 1920])
    #         y      torch.Size([1, 19, 1080, 1920])
    #         logits torch.Size([19, 1080, 1920])

After this, I need to calculate the entropy E = -sum(p_i * log2(p_i)) of the image (or of some patches of it).
Can someone please help? It seems like torch doesn't have an entropy method.

The output of the nn.Softmax(dim=1) layer will sum to 1 along the specified dimension, as seen here:

    x = torch.randn(10, 10, 10, 10)
    smax = nn.Softmax(dim=1)
    out = smax(x)
    print(out.sum())
    # > tensor(1000.)  -- sums over all 10*10*10 slices
    print(out.sum(dim=1))  # all ones
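To make the connection between the two numbers explicit, here is a small sketch (same random tensor shape as above) checking both the per-dimension sum and the total:

```python
import torch
import torch.nn as nn

x = torch.randn(10, 10, 10, 10)
out = nn.Softmax(dim=1)(x)

# Each slice along dim=1 is a probability distribution, so it sums to 1.
print(torch.allclose(out.sum(dim=1), torch.ones(10, 10, 10), atol=1e-5))  # True

# The total sum is therefore the number of such slices: 10 * 10 * 10 = 1000.
print(torch.isclose(out.sum(), torch.tensor(1000.), atol=1e-2))  # True
```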

So for calculating the entropy, should I take the softmax over every dimension?
And can you help me calculate the entropy, please?
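One way to compute E = -sum(p_i * log2(p_i)) per pixel, assuming the probabilities come from a single softmax over the class dimension (the names `logits`/`probs` and the small 4×4 spatial size here are just for illustration, not from the original code):

```python
import torch
import torch.nn.functional as F

# Hypothetical logits: batch 1, 19 classes, a small 4x4 spatial grid for the demo.
logits = torch.randn(1, 19, 4, 4)
probs = F.softmax(logits, dim=1)  # sums to 1 along the class dimension

# Per-pixel entropy in bits: E = -sum_i p_i * log2(p_i), summed over classes.
eps = 1e-12  # guard against log2(0)
entropy = -(probs * torch.log2(probs + eps)).sum(dim=1)

print(entropy.shape)  # torch.Size([1, 4, 4]) -- one entropy value per pixel
# The entropy of a 19-class distribution is at most log2(19), about 4.25 bits.
```

Averaging `entropy` over the spatial dimensions (or over a cropped region) then gives a single entropy value for the whole image or for a patch of it.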