How to use autograd to differentiate twice when the output is non-scalar?

def physics_loss(pinn_output, mass, forces):
    # Create a tensor of time values
    time = torch.linspace(0, (len(pinn_output)-1)*0.01, len(pinn_output)).requires_grad_(True)

    # Repeat the time tensor to match the dimensions of pinn_output
    time = time.unsqueeze(1).repeat(1, pinn_output.shape[1])

    # Compute velocity by differentiating position with respect to time
    v_pred = torch.autograd.grad(pinn_output, time, grad_outputs=torch.ones_like(pinn_output), create_graph=True, allow_unused=True)[0]

    # Compute acceleration by differentiating velocity with respect to time
    a_pred = torch.autograd.grad(v_pred, time, grad_outputs=torch.ones_like(v_pred), create_graph=True, allow_unused=True)[0]

    # Adjust for gravity in the y-direction
    a_pred[:, 1] -= 9.81  # Subtract gravity from y-acceleration

    # Compute the physics loss
    return torch.mean((forces - mass * a_pred) ** 2)

I have this physics loss function and I am trying to compute v_pred and a_pred, but I keep getting errors like: "TypeError: ones_like(): argument 'input' (position 1) must be Tensor, not NoneType"

Hi @santom,

Can you check the types of pinn_output and v_pred? One of them seems to be None instead of a Tensor.
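A quick way to see where a None would come from: torch.autograd.grad returns None for any input that the output does not actually depend on, when allow_unused=True is set. A minimal sketch (the tensors here are stand-ins, not the actual PINN output):

```python
import torch

# `time` requires grad, but `output` is NOT computed from it --
# mirroring a time tensor created after the forward pass already happened.
time = torch.linspace(0, 1, 10, requires_grad=True)
output = torch.randn(10, 2).requires_grad_(True) * 2  # no dependence on `time`

grad = torch.autograd.grad(
    output, time,
    grad_outputs=torch.ones_like(output),
    create_graph=True,
    allow_unused=True,  # without this, autograd raises an error instead of returning None
)[0]
print(type(grad))  # <class 'NoneType'> -- `time` was never used
```

Passing that None into torch.ones_like is exactly what produces the TypeError on the second grad call.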

hey!

for pinn_output, this is the type: <class 'torch.Tensor'>

for v_pred, this is the type: <class 'NoneType'>

But I don't understand why v_pred is NoneType. Do you know why?

If you set allow_unused=False, does autograd report that there are Tensors not used in the calculation of the gradient?

Is the time Tensor used in the calculation of pinn_output?
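That is usually the root cause: the time tensor has to be built first and fed through the forward pass, so it is part of the graph that produced the positions. A minimal sketch of that pattern with a stand-in network (the layer sizes and the 0.01 s step are assumptions, not the original PINN):

```python
import torch

# Stand-in for the PINN: maps time (N, 1) to positions (x, y) of shape (N, 2).
model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2)
)

# Build `time` BEFORE the forward pass and mark it as requiring grad.
time = torch.linspace(0, 0.99, 100).unsqueeze(1).requires_grad_(True)
pos = model(time)  # positions computed FROM time, so time is in the graph

# First derivative: velocity. create_graph=True keeps the graph
# so the result can itself be differentiated.
v = torch.autograd.grad(pos, time, grad_outputs=torch.ones_like(pos),
                        create_graph=True)[0]

# Second derivative: acceleration.
a = torch.autograd.grad(v, time, grad_outputs=torch.ones_like(v),
                        create_graph=True)[0]

print(v.shape, a.shape)  # torch.Size([100, 1]) torch.Size([100, 1])
```

One caveat for the non-scalar case: with grad_outputs=torch.ones_like(pos), each row of the result is the vector-Jacobian product, i.e. the sum dx/dt + dy/dt for that sample. To get the derivative of each component separately, differentiate each output column on its own.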