# Optimizing a function with respect to specific arguments

I’m new to PyTorch so you’ll have to forgive me for this basic question. I couldn’t find an answer on the forums.

Okay, so you have a simple function like this:

```python
def square(x):
    return x**2
```

Optimizing it is trivial (I am aware that this is a silly example):

```python
# requires_grad=True is needed so autograd tracks params and backward() works
params = torch.tensor([4.], requires_grad=True)

print(square(params))
n_optim_steps = int(1e4)
optimizer = torch.optim.SGD([params], lr=1e-2)

for ii in range(n_optim_steps):
    optimizer.zero_grad()  # clear gradients accumulated from the previous step
    loss = square(params)
    print('Step # {}, loss: {}'.format(ii, loss.item()))
    loss.backward()
    optimizer.step()
```

But what if we change the function “square” a bit so that it becomes:

```python
def square_x_then_multiply_y(x, y):
    return x**2 * y
```

Is there a trivial way to optimize the function with respect to x for a fixed y?

Thanks!


Would it work to define `y` as a constant tensor and change your code to `loss = square_x_then_multiply_y(params, y)`? As long as `y` doesn't require gradients and isn't passed to the optimizer, only `x` will be updated.

Best regards

Thomas
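To make that concrete, here is a minimal sketch of the suggestion (the value of `y` and the step count are arbitrary, just for illustration): `y` is an ordinary tensor with no `requires_grad`, so it is treated as a constant, and only `params` is handed to the optimizer.

```python
import torch

def square_x_then_multiply_y(x, y):
    return x**2 * y

# x is the parameter being optimized; y is a fixed constant
params = torch.tensor([4.], requires_grad=True)
y = torch.tensor([3.])  # no requires_grad, not passed to the optimizer

optimizer = torch.optim.SGD([params], lr=1e-2)  # only params is updated
for _ in range(1000):
    optimizer.zero_grad()
    loss = square_x_then_multiply_y(params, y)
    loss.backward()  # gradient flows to params only; y stays fixed
    optimizer.step()

print(params.item())  # driven toward 0, the minimizer of x**2 * y for fixed y > 0
```

Since `y > 0` here, the loss `x**2 * y` is minimized at `x = 0`, and gradient descent converges there while `y` never changes.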

That definitely works.

Thanks, Tom.

DS