So far I’ve been adding layers to my neural nets using `nn.Linear`, but suppose I wanted the forward pass of a layer to look something like `nlin(linear(x))`, where `nlin` is a nonlinearity, say `nn.Softshrink`, which is parametrized by some lambda, and I wanted to make that lambda a trainable parameter.
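For concreteness, here is roughly what I have now (the class name and sizes are just placeholders), with lambda fixed at construction time:

```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, in_features, out_features, lambd=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # lambd is a fixed hyperparameter here; I want it to be learned instead
        self.nlin = nn.Softshrink(lambd=lambd)

    def forward(self, x):
        return self.nlin(self.linear(x))
```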
How would I do this? Do I have to write my own version of softshrink? If so, what do I need to do so that it plays nicely with the other `nn` modules and so that autodifferentiation works properly?
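My best guess is something like the sketch below, where I write the softshrink formula out in elementary ops and register lambda as an `nn.Parameter` (the class name `LearnableSoftshrink` is just my own). Would this be the right approach, and will autograd compute the gradient with respect to `lambd` correctly?

```python
import torch
import torch.nn as nn

class LearnableSoftshrink(nn.Module):
    # Softshrink with lambda as a learnable parameter
    def __init__(self, lambd_init=0.5):
        super().__init__()
        # nn.Parameter so it shows up in model.parameters() and gets optimized
        self.lambd = nn.Parameter(torch.tensor(lambd_init))

    def forward(self, x):
        # softshrink(x) = x - lambd for x > lambd, x + lambd for x < -lambd,
        # and 0 otherwise, which is the same as sign(x) * relu(|x| - lambd)
        return torch.sign(x) * torch.relu(x.abs() - self.lambd)
```

In particular, is wrapping the scalar in `nn.Parameter` all it takes for the optimizer to update it, or is there more bookkeeping I need to do?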