How do I replace every ReLU in a Torchvision model with a different activation function?

For example, say I wanted to replace each ReLU with an ELU. How would I go about it?

It depends on which model you would like to use.
Some torchvision.models store the non-linearity as a module attribute, such as self.act, which can easily be replaced by assigning a new module to it, while others call the functional API (e.g. F.relu inside forward), in which case you would need to override the forward method.
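
For models that expose their activations as submodules, one generic approach is to walk the module tree and swap every nn.ReLU instance. This is a sketch of my own, not part of the original answer, and replace_relu_with_elu is a name I made up:

import torch.nn as nn
from torchvision import models

def replace_relu_with_elu(module: nn.Module) -> None:
    # Recursively visit all child modules and swap each nn.ReLU for nn.ELU,
    # preserving the inplace flag of the original activation.
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, nn.ELU(inplace=child.inplace))
        else:
            replace_relu_with_elu(child)

model = models.resnet18()
replace_relu_with_elu(model)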

Thanks, that makes sense! Could you tell me which models use self.act and show me a snippet of how to change it?

The ResNet implementations use self.relu, and you could replace it via:

import torch.nn as nn
from torchvision import models

model = models.resnet18()
model.relu = nn.SELU()  # swaps the top-level stem activation
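
One caveat worth knowing: this assignment only replaces the top-level self.relu after the stem convolution; each residual block keeps its own self.relu submodule, so a recursive helper like the sketch above is needed if you want to swap them all. A quick sanity check:

num_relus = sum(isinstance(m, nn.ReLU) for m in model.modules())
print(num_relus)  # counts the nn.ReLU modules still left in the model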