I am trying to compute the gradient of the model output with respect to the input. The gradient should be computed on some slices of the inputs and outputs, as can be seen in the code below.
Briefly, the model is a time-series model: the input is 3D, and after passing through the model the output is 2D.
The initial shape of the data is input = (3150, 9) and output = (3150, 8).
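For context, the data and model are set up roughly like the sketch below. This is only a simplified stand-in: the window length, batch size, architecture, and the values in config are placeholders for my real setup.

import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

# raw series: 3150 time steps, 9 input features, 8 target features
raw_inputs = torch.randn(3150, 9)
raw_targets = torch.randn(3150, 8)

delay = 10  # placeholder for config["delay"]
config = {"num_epochs": 10, "delay": delay}  # placeholder values

# sliding windows: each sample contains delay + 1 consecutive time steps
windows = torch.stack([raw_inputs[i:i + delay + 1] for i in range(3150 - delay)])  # (3140, delay+1, 9)
targets = raw_targets[delay:]                                                      # (3140, 8)
train_loader = DataLoader(TensorDataset(windows, targets), batch_size=64, shuffle=True)

# placeholder model mapping a 3D batch (batch, delay+1, 9) to a 2D batch (batch, 8)
model = nn.Sequential(nn.Flatten(), nn.Linear((delay + 1) * 9, 64), nn.Tanh(), nn.Linear(64, 8))
optimizer = torch.optim.Adam(model.parameters())

The training loop where I try to compute the gradient looks like this: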
for epoch in range(config["num_epochs"]):
    model.train()
    train_loss = 0.0
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        outputs = model(inputs)

        u1 = outputs[:, 0:1]
        u2 = outputs[:, 1:2]
        u3 = outputs[:, 2:3]
        u4 = outputs[:, 3:4]
        u5 = outputs[:, 4:5]
        u6 = outputs[:, 5:6]
        u7 = outputs[:, 6:7]
        u8 = outputs[:, 7:8]

        last_input = inputs[:, config["delay"], :]
        last_input.requires_grad = True

        print("u1 requires grad:", u1.requires_grad)
        print("is u1 leaf:", u1.is_leaf)
        print("last_input requires grad:", last_input.requires_grad)
        print("is last_input leaf:", last_input.is_leaf)

        ################################### output_1 ###############################################
        du1 = torch.autograd.grad(u1, last_input, grad_outputs=torch.ones_like(u1),
                                  retain_graph=True, create_graph=True, allow_unused=True)[0]
        if du1 is None:
            print("Gradient is None!")
        else:
            print("Gradient is not None!")
After running the code, I get:
u1 requires grad: True
is u1 leaf: False
last_input requires grad: True
is last_input leaf: True
Gradient is None!
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-43-0115271f0e18> in <module>
235
236 #print(du1.shape, du1.dtype)
--> 237 u1_t = du1[:, 0:1]
238 u1_x1 = du1[:, 1:2]
239 u1_x2 = du1[:, 2:3]
TypeError: 'NoneType' object is not subscriptable
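As far as I can tell, the essential pattern reduces to the small sketch below (the model and shapes here are placeholders, not my real code), and it already shows the same None gradient:

import torch
import torch.nn as nn

# placeholder model: (batch, 5, 9) -> (batch, 8)
model = nn.Sequential(nn.Flatten(), nn.Linear(5 * 9, 8))
inputs = torch.randn(4, 5, 9)

outputs = model(inputs)
u1 = outputs[:, 0:1]

# slice of the input taken after the forward pass, then marked as requiring grad
last_input = inputs[:, -1, :]
last_input.requires_grad = True

du1 = torch.autograd.grad(u1, last_input, grad_outputs=torch.ones_like(u1),
                          allow_unused=True)[0]
print(du1)  # None, just like in the full training loop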
So my question is: what am I doing wrong that I cannot compute the gradient?