Optimizing one of the model outputs w.r.t. the input

I have a model with the following definition:

import torch
import torch.nn as nn

class model(nn.Module):
    def __init__(self):
        super(model, self).__init__()
        self.linear1 = nn.Linear(2, 512)   # shared first layer
        self.linear2 = nn.Linear(512, 64)  # y head
        self.linear3 = nn.Linear(64, 1)    # y head output
        self.linear4 = nn.Linear(512, 1)   # x head output

    def forward(self, x):
        x = self.linear1(x)
        y = x
        x = self.linear4(x)

        y = self.linear2(y)
        y = self.linear3(y)

        return x, y
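
For concreteness, a forward pass returns both heads (a quick sanity check with a random batch; the batch size here is arbitrary):

net = model()
x_out, y_out = net(torch.randn(4, 2))  # batch of 4 two-dimensional inputs
print(x_out.shape, y_out.shape)        # torch.Size([4, 1]) torch.Size([4, 1])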

After training the model on a particular dataset, I want to find the input tensor that maximizes y. To summarize the problem:
i) I have trained the model on a dataset.
ii) Now I want to find the input that gives the maximum value of y. Here I do not care about the value of x that the model also returns.

I know that I should use gradient ascent for this, but I am having trouble figuring out how to compute the gradient of y w.r.t. the input while keeping the trained parameter values of the model fixed.
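
For reference, this is the kind of setup I am imagining; the checkpoint path, optimizer choice, learning rate, and step count below are placeholders, not part of my trained model:

net = model()
# net.load_state_dict(torch.load("checkpoint.pt"))  # placeholder path for the trained weights
net.eval()

# Freeze the trained parameters so that only the input is updated
for p in net.parameters():
    p.requires_grad_(False)

# Treat the input itself as the optimization variable
inp = torch.randn(1, 2, requires_grad=True)
opt = torch.optim.Adam([inp], lr=1e-2)  # assumed optimizer and learning rate

for step in range(1000):  # assumed number of steps
    opt.zero_grad()
    _, y = net(inp)    # ignore the x output
    loss = -y.sum()    # minimizing -y is gradient ascent on y
    loss.backward()
    opt.step()

print(inp.detach(), y.item())

Is freezing the parameters like this the right way to keep the trained weights fixed while taking gradients w.r.t. the input?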