How to calculate entropy of each class to measure model uncertainty

I am trying to calculate entropy to measure my model's uncertainty using MC Dropout for an image classification task. I have calculated the entropy for each sample using the mean of the T stochastic forward passes, `output_mean`, as shown in the code below:

    for images, labels in testloader:
        images = images.to(device)  # assuming a `device` defined earlier, e.g. torch.device("cuda")
        labels = labels.to(device)
        with torch.no_grad():
            output_list = []
            # T stochastic forward passes with dropout kept active (MC Dropout)
            for i in range(T):
                # softmax in case model(images) returns raw logits -- entropy needs probabilities
                output_list.append(torch.unsqueeze(torch.softmax(model(images), dim=-1), 0))
        # mean of the T predictive distributions -> shape (n_samples, n_classes)
        output_mean = torch.cat(output_list, 0).mean(0)
        output_mean = np.asarray(output_mean.cpu())
        epsilon = sys.float_info.min
        # entropy of each sample across the MCD forward passes -> shape (n_samples,)
        entropy = -np.sum(output_mean * np.log(output_mean + epsilon), axis=-1)
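To sanity-check the per-sample entropy, here is a small self-contained example on dummy probability vectors (the numbers and `n_classes = 3` are made up purely for illustration):

```python
import sys
import numpy as np

epsilon = sys.float_info.min

# dummy mean predictive distributions for 3 samples over 3 classes
output_mean = np.array([
    [1.0, 0.0, 0.0],        # fully confident -> entropy ~0
    [1/3, 1/3, 1/3],        # maximally uncertain -> entropy log(3)
    [0.8, 0.1, 0.1],        # somewhere in between
])

# same formula as above: one entropy value per sample
entropy = -np.sum(output_mean * np.log(output_mean + epsilon), axis=-1)
print(entropy.shape)  # (3,) -- per sample, not per class
```

This confirms the result has shape `(n_samples,)`, which is why I still need a separate step to get something per class.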

After calculating the entropy of each sample, I would like to calculate the entropy for each class, to measure the model's uncertainty about each one of them. Can anyone help me find the right formula for this per-class entropy?
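For context, the closest thing I have tried is to group the per-sample entropies by predicted class and average them. This is only my own guess at an interpretation (the grouping choice and variable names are mine, not from a paper), and the probabilities below are dummy data:

```python
import sys
import numpy as np

epsilon = sys.float_info.min

# dummy mean predictive distributions (as output_mean above), made up for illustration
output_mean = np.array([
    [0.90, 0.05, 0.05],
    [0.20, 0.70, 0.10],
    [0.40, 0.50, 0.10],
    [0.85, 0.10, 0.05],
])

entropy = -np.sum(output_mean * np.log(output_mean + epsilon), axis=-1)  # per sample
predictions = output_mean.argmax(axis=-1)                                # predicted class per sample

n_classes = output_mean.shape[1]
# mean entropy over the samples predicted as each class (nan if a class is never predicted)
class_entropy = np.array([
    entropy[predictions == c].mean() if np.any(predictions == c) else np.nan
    for c in range(n_classes)
])
print(class_entropy)  # one value per class
```

I am not sure whether averaging over the predicted class (rather than the true label, or some other aggregation) is the standard way to do this, which is what I am asking about.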