Majority voting

Can I use majority voting with softmax activation function outputs in PyTorch to aggregate predictions from a group of classifiers, like 4 CNN models, by combining their softmax probabilities? Additionally, how would approaches like hard, soft, and weighted voting be applied in this context?

You could do it like this:

import torch
import torch.nn.functional as F

# Suppose you have 4 models; collect their softmax outputs for a batch of inputs.
# If your models return raw logits, apply softmax explicitly:
softmax_outputs = [F.softmax(model(input_data), dim=1) for model in models]

# Convert the probabilities to per-model class predictions
# (argmax over raw logits gives the same result, since softmax is monotonic)
class_predictions = [torch.argmax(output, dim=1) for output in softmax_outputs]

# Stack the predictions and compute the mode
stacked_predictions = torch.stack(class_predictions, dim=0)
majority_vote_predictions, _ = torch.mode(stacked_predictions, dim=0)

There's a nice section on speeding this up with vmap: Model ensembling — PyTorch Tutorials 2.4.0+cu121 documentation
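The idea from that tutorial, sketched with hypothetical stand-in models (4 tiny linear classifiers and random inputs here, just for shape checking): stack the parameters of structurally identical models and run them all in one vmapped call instead of a Python loop.

```python
import copy
import torch
from torch.func import stack_module_state, functional_call

# Hypothetical stand-ins: 4 classifiers with identical architecture
models = [torch.nn.Linear(5, 3) for _ in range(4)]
x = torch.randn(8, 5)

# Stack all parameters/buffers so one vmapped call runs every model at once
params, buffers = stack_module_state(models)
base = copy.deepcopy(models[0]).to("meta")  # stateless template module

def run_one(p, b, inp):
    return functional_call(base, (p, b), (inp,))

# in_dims=(0, 0, None): map over the model dimension, share the input batch
logits = torch.vmap(run_one, in_dims=(0, 0, None))(params, buffers, x)
# logits has shape (num_models, batch_size, num_classes)
```

From there you can softmax and vote along dim=0 as in the loop-based version above.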

Could you please provide a detailed explanation of the steps involved in applying majority voting? Additionally, could you clarify whether this process is considered hard, soft, or weighted voting?

Majority voting, from my understanding, is hard voting.
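To contrast the three schemes, here is a minimal sketch (random softmax outputs stand in for your 4 models' predictions; the weights are made-up example values):

```python
import torch

# Hypothetical setup: 4 models' softmax outputs, shape (batch_size, num_classes)
probs = [torch.softmax(torch.randn(8, 3), dim=1) for _ in range(4)]
stacked = torch.stack(probs, dim=0)  # (num_models, batch, classes)

# Hard voting: each model casts one vote, take the most common class
hard_vote, _ = torch.mode(stacked.argmax(dim=2), dim=0)

# Soft voting: average the probabilities across models, then argmax
soft_vote = stacked.mean(dim=0).argmax(dim=1)

# Weighted voting: scale each model's probabilities before summing
weights = torch.tensor([0.4, 0.3, 0.2, 0.1]).view(-1, 1, 1)
weighted_vote = (stacked * weights).sum(dim=0).argmax(dim=1)
```

The snippet in the first reply is the hard-voting case; soft and weighted voting keep the full probability distributions in play instead of discarding them at the argmax step.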

For a detailed analysis on voting in general, you could refer to this: https://machinelearningmastery.com/voting-ensembles-with-python/

I can’t find the mathematical equations for each one, though.

https://rasbt.github.io/mlxtend/user_guide/classifier/EnsembleVoteClassifier/
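For reference, the three schemes can be written down directly. Assuming $M$ models, $p_i(c \mid x)$ the $i$-th model's softmax probability for class $c$, and weights $w_i$, a sketch:

```latex
\hat{y}_{\text{hard}} = \operatorname{mode}\left\{\arg\max_c p_1(c \mid x),\, \dots,\, \arg\max_c p_M(c \mid x)\right\}

\hat{y}_{\text{soft}} = \arg\max_c \frac{1}{M} \sum_{i=1}^{M} p_i(c \mid x)

\hat{y}_{\text{weighted}} = \arg\max_c \sum_{i=1}^{M} w_i \, p_i(c \mid x)
```

Soft voting is the special case of weighted voting with $w_i = 1/M$ for every model.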