Training parameters in nonlinearities

So far I’ve been adding layers to my neural nets using nn.Linear, but suppose I wanted the forward pass of a layer to be self.nlin(self.Psi(x)), where nlin is a nonlinearity, say nn.Softshrink, parametrized by some lambda, and I wanted to make that lambda a trainable parameter.

How would I do this? Do I have to write my own version of softshrink? If so, what do I need to do to play nicely with the other nn modules and get autodifferentiation to work properly?

It all depends on how you want to change it; there’s no single good recipe. You should be able to implement it as a plain Python function that operates on Variables, and the gradient will be computed automatically. Another approach would be to implement your own autograd Function. You can find some notes on that in the docs.
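As a concrete sketch of the first approach: express softshrink with ordinary differentiable tensor ops and register lambda as an nn.Parameter, so autograd computes the gradient with respect to it and the optimizer updates it along with the other weights. The class name LearnableSoftshrink, the init value, and the Net wrapper below are my own choices for illustration, not part of torch.nn (this uses the modern Tensor API rather than Variables):

```python
import torch
import torch.nn as nn


class LearnableSoftshrink(nn.Module):
    def __init__(self, lambd_init=0.5):
        super().__init__()
        # Registering lambd as a Parameter puts it in module.parameters(),
        # so any optimizer constructed from them will train it.
        self.lambd = nn.Parameter(torch.tensor(lambd_init))

    def forward(self, x):
        # Softshrink written with differentiable ops:
        # f(x) = sign(x) * max(|x| - lambd, 0)
        # Autograd then handles d(output)/d(lambd) automatically.
        return torch.sign(x) * torch.relu(x.abs() - self.lambd)


class Net(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.Psi = nn.Linear(in_features, out_features)
        self.nlin = LearnableSoftshrink()

    def forward(self, x):
        # The forward pass from the question: self.nlin(self.Psi(x))
        return self.nlin(self.Psi(x))


net = Net(10, 10)
# lambd shows up alongside the nn.Linear weights, so it gets trained too:
opt = torch.optim.SGD(net.parameters(), lr=0.1)
```

No custom backward is needed here because every op in forward is differentiable; writing your own autograd Function only becomes necessary if you can’t express the nonlinearity in terms of existing ops.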