Backprop through Jacobian

I would like to construct my loss function so that it is a function of the Jacobian, as follows:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1, 1),
    nn.ReLU())
opt = torch.optim.SGD(model.parameters(), lr=0.001)

z = torch.tensor([2.])
det = torch.autograd.functional.jacobian(model, z)
loss = torch.abs(det) ** 2
loss.backward()  # fails: det is not part of the autograd graph

I would appreciate any suggestions on how I could get this to work. Thanks

Hi,

You just need to pass the create_graph=True flag to the call to torch.autograd.functional.jacobian and it will work. By default the Jacobian is not computed in a differentiable manner, so its output is detached from the autograd graph and loss.backward() cannot propagate through it.
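For reference, here is a minimal sketch of your snippet with that flag added (same toy model, optimizer, and input, kept only for illustration):

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1, 1),
    nn.ReLU())
opt = torch.optim.SGD(model.parameters(), lr=0.001)

z = torch.tensor([2.])

# create_graph=True keeps the operations used to compute the Jacobian
# in the autograd graph, so the result can be differentiated again.
det = torch.autograd.functional.jacobian(model, z, create_graph=True)
loss = torch.abs(det) ** 2
loss.backward()

opt.step()       # parameters receive gradients of the Jacobian-based loss
opt.zero_grad()

Note that the Jacobian here is taken with respect to the input z, not the parameters; backward() then differentiates that Jacobian-based loss with respect to the parameters, which is exactly what create_graph=True enables.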
