If I change the activation function of VGG16 to Swish, ReLU6, LeakyReLU, Mish, etc., will the accuracy improve or go down?
I’m not aware of any research comparing a pretrained vanilla VGG16 against the same architecture with different activation functions. It’s an interesting question, so let us know if you run some experiments and get results. One thing to keep in mind: a pretrained VGG16’s weights were optimized around ReLU, so simply swapping the activation without fine-tuning will almost certainly hurt accuracy; the fair comparison is after retraining or fine-tuning with the new activation.