Customize an activation function

I’m not sure if the nested loop is needed, since the parameters only contain a single element, which could probably be applied directly to the input tensor via broadcasting.
However, I would recommend using some defined input data, calculating the backward pass, and checking the gradients against the expected values.
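For example, here is a minimal sketch of such a check; the scaled-tanh function and the parameter name `alpha` are assumptions standing in for your custom activation:

```python
import torch

# Hypothetical scaled tanh with a single learnable parameter "alpha"
# (an assumption standing in for the custom activation discussed here).
alpha = torch.nn.Parameter(torch.tensor(2.0))

def scaled_tanh(x):
    # broadcasting applies the scalar parameter elementwise; no loop needed
    return alpha * torch.tanh(x)

# defined input data with a known analytical gradient
x = torch.randn(3, 4, requires_grad=True)
out = scaled_tanh(x)
out.sum().backward()

# d/dx [alpha * tanh(x)] = alpha * (1 - tanh(x)^2)
expected = alpha.detach() * (1 - torch.tanh(x.detach()) ** 2)
print(torch.allclose(x.grad, expected))  # should print True
```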

Many thanks for your reply.
Could you please tell me how I can write it without the loop, so that it is applied directly to the input?

Hi Ptrblck,

Sorry, I need to customize the Tanh function so that it saturates at a specific value.
I did not find anything for Tanh; I only found a sigmoid with learnable parameters.

tanh is just a rescaled version of the logistic sigmoid function as described here by @rasbt.
You could reuse the sigmoid function with learnable parameters, which you have already found, or adapt it to fit your use case.
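As a sketch of this adaptation (assuming the goal is an output that saturates at a learnable level `a`; the parameter name and initial value are not from this thread):

```python
import torch
import torch.nn as nn

class SaturatingTanh(nn.Module):
    """Tanh whose saturation level is a learnable parameter.

    A sketch based on the identity tanh(x) = 2 * sigmoid(2x) - 1;
    the parameter name "a" and its init value are assumptions.
    """
    def __init__(self, init_a=3.0):
        super().__init__()
        self.a = nn.Parameter(torch.tensor(float(init_a)))

    def forward(self, x):
        # rescaled sigmoid == tanh; the output saturates at +/- a
        # the single parameter broadcasts over the whole tensor,
        # so no explicit loop over elements is required
        return self.a * (2.0 * torch.sigmoid(2.0 * x) - 1.0)

act = SaturatingTanh(init_a=5.0)
x = torch.linspace(-10, 10, 5)
print(act(x))  # values approach +/- 5 at the extremes
```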


Hi Ptrblck,

I am of two minds and need your advice. I am using nn.LeakyReLU(0.2, inplace=True) in my discriminator with CNN layers.

I’m not sure whether I should set inplace to False or keep it True in my GAN. I read some comments, but they were not clear.

Many thanks

If no error is raised, you could set it to inplace=True and save a bit of memory.
On the other hand, if an error is raised or you are planning to script the model, you could leave it as inplace=False.
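For illustration, a minimal sketch of the kind of error autograd would raise in that case (the surrounding exp() is just an example of an op that saves its output for the backward pass):

```python
import torch
import torch.nn as nn

act = nn.LeakyReLU(0.2, inplace=True)

x = torch.randn(4, requires_grad=True)
y = x.exp()      # exp() saves its output for the backward pass
out = act(y)     # modifies y in place, bumping its version counter
try:
    out.sum().backward()
except RuntimeError as e:
    print(e)     # autograd reports a needed tensor was modified in place
```

With inplace=False the same snippet runs without errors, since the activation writes its result to a new tensor.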


But it has no effect on the final result, whether it’s False or True?

That is correct. It won’t change any results and is only a potential optimization, which could save memory.


@Tudor_Berariu Hi!
Can you tell me why it is okay not to write the backpropagation (backward) step explicitly?