"Softplus can only be differentiated once"

Hi,

I have stumbled on a runtime error during the backward pass. PyTorch has no problem doing a double backward for Softmax, but it raises an error for Softplus. Would you please add double backward support for Softplus as well?

Thanks a lot!

import torch
import torch.nn as nn
from torch.autograd import Variable

x = Variable(torch.rand(1, 10), requires_grad=True)
w = Variable(torch.rand(10, 10))
y = torch.mm(x, w)
g = nn.Softplus()
l = torch.sum(g(y))
# First backward: keep the graph so the gradient can be differentiated again
h = torch.autograd.grad(l, x, create_graph=True)
L = torch.sum(h[0])
L.backward()  # RuntimeError: Softplus can only be differentiated once

Softplus already has double backward support on the master branch, and it will be included in the next release.
If you need it now, you can build from source by following the instructions at https://github.com/pytorch/pytorch#from-source
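
If building from source is not an option right away, one possible interim workaround (a sketch, not part of the official fix) is to express softplus through elementwise primitives, assuming torch.log and torch.exp already support double backward in the release you are running. Note this formulation is less numerically stable than nn.Softplus for large inputs.

import torch
import torch.nn as nn
from torch.autograd import Variable

def softplus_manual(t):
    # softplus(t) = log(1 + exp(t)); composed from primitives that
    # (assumption) already support double backward in this release.
    # Can overflow for large t, unlike nn.Softplus.
    return torch.log(1 + torch.exp(t))

x = Variable(torch.rand(1, 10), requires_grad=True)
w = Variable(torch.rand(10, 10))
y = torch.mm(x, w)
l = torch.sum(softplus_manual(y))
h = torch.autograd.grad(l, x, create_graph=True)
L = torch.sum(h[0])
L.backward()   # no "differentiated once" error with the manual formulation
print(x.grad)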
