How to define a loss function for torch.optim.SGD()


I am totally new to PyTorch and I have a question about optimization using SGD. I would like to distribute n points on the surface of a unit sphere in 5D space. To generate random points I use:
points = torch.randn((npoints, dim), requires_grad=True)

I would like to optimize these points using torch.optim.SGD(), and I need to define a proper loss function for this problem to use with loss.backward().
torch.optim.SGD([points], lr=0.001)
To compare the quality of the optimization, I calculate the distances to the k nearest neighbors of each point and the standard deviation of that array.
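One way to compute this metric (a sketch, not my exact code; `knn_spread` is a made-up helper name, and k=3 is an arbitrary choice):

```python
import torch

def knn_spread(points, k=3):
    # knn_spread is a hypothetical helper name for illustration.
    # Project onto the unit sphere before measuring distances.
    p = points / points.norm(dim=1, keepdim=True)
    # Full pairwise Euclidean distance matrix
    d = torch.cdist(p, p)
    # The k+1 smallest distances per row include the zero self-distance; drop it.
    nearest = d.topk(k + 1, largest=False).values[:, 1:]
    return nearest.mean().item(), nearest.std().item()

mean_d, std_d = knn_spread(torch.randn(100, 5))
```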

I would appreciate any help in this regard.


I'm a bit confused about what you want to do. From my understanding, optimizers interact with loss functions only via gradients, so you would never really "define a loss function in torch.optim.SGD()".

If you have a differentiable function f that quantifies how well the n points are distributed on the unit sphere, then you could do something like:

optimizer = torch.optim.SGD([points], lr=0.001)
for _ in range(num_iters):
    loss = f(points)  # calculate loss
    loss.backward()   # back propagate
    optimizer.step()  # update parameters
    optimizer.zero_grad()  # reset gradients
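For a concrete example of such an f (a sketch under my own assumptions, not the only choice): one common way to spread points on a sphere is to minimize a Coulomb-style repulsion energy, the sum of inverse pairwise distances between the points projected onto the sphere. Using torch.nn.functional.pdist, which returns only the distances between distinct pairs, avoids dividing by the zero self-distances:

```python
import torch

torch.manual_seed(0)
npoints, dim = 100, 5
points = torch.randn((npoints, dim), requires_grad=True)

def f(points):
    # Map the free parameters onto the unit sphere so the energy
    # is measured on the surface itself
    p = points / points.norm(dim=1, keepdim=True)
    # Distances over distinct pairs only (no zero diagonal entries)
    d = torch.nn.functional.pdist(p)
    # Coulomb-style repulsion energy: minimizing this pushes points apart
    return (1.0 / d).sum()

optimizer = torch.optim.SGD([points], lr=0.001)
losses = []
for _ in range(200):
    optimizer.zero_grad()   # reset gradients from the previous step
    loss = f(points)        # evaluate the repulsion energy
    loss.backward()         # back propagate to points.grad
    optimizer.step()        # move the points downhill
    losses.append(loss.item())
```

After training, `points / points.norm(dim=1, keepdim=True)` gives the actual positions on the sphere; the raw `points` tensor is free to drift off the surface because the normalization happens inside f.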