Average different layers in PyTorch

How do I compute the average of 2 or more layers, i.e. z = keras.layers.average([x, y, w]) as done in Keras? Should I first add them via torch.add and then divide by the number of layers, or is there a direct way to do it?

If you want to calculate the average of the activations of different layers, you could simply add them and divide by the number of activations (3 in your case).
I’m not sure what “average of layers” means here, but if you want to average the parameters of different layers instead, you could use this approach.
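For the activation case, a minimal sketch (the tensor shapes are just placeholder examples) showing that summing and dividing is equivalent to stacking the tensors and taking the mean over the new dimension:

```python
import torch

# Three activation tensors of the same shape (hypothetical example shapes)
x = torch.randn(4, 8)
y = torch.randn(4, 8)
w = torch.randn(4, 8)

# Element-wise average by summing and dividing by the number of tensors
z = (x + y + w) / 3

# Equivalent: stack along a new dim 0 and take the mean over it
z_stacked = torch.stack([x, y, w], dim=0).mean(dim=0)

print(torch.allclose(z, z_stacked))  # True
```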

You mean to say that to implement the above-mentioned Keras average layer in PyTorch, it would be z = (x + y + w) / 3?

If the Keras layer calculates the mean of the activations, then yes, your approach should work.
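If you want the averaging wrapped as a reusable layer like in Keras, one way is a small nn.Module; this Average class is a hypothetical name, not part of torch.nn:

```python
import torch
from torch import nn

class Average(nn.Module):
    """Hypothetical PyTorch counterpart of the Keras average layer:
    element-wise mean of any number of same-shaped tensors."""
    def forward(self, *tensors):
        # Stack along a new leading dim, then average over it
        return torch.stack(tensors, dim=0).mean(dim=0)

avg = Average()
x = torch.ones(2, 3)
y = torch.full((2, 3), 2.0)
w = torch.full((2, 3), 3.0)
z = avg(x, y, w)
print(z)  # every element is 2.0
```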
