Constraining parameter vector to unit sphere

I am creating a class for my model using nn.Module. It's a simple model whose only parameters are a set of linear filters, defined by self.f = nn.Parameter(f_init), where self.f has shape (n_filters, n_dims).

The issue is that I want to constrain the filters to have unit norm. So far I am doing this with the wonderful geotorch package, via the call geotorch.sphere(self, "f"). That way I don't have to think about the constraint during back-propagation.

My question is whether there is a simple way of implementing this constraint without geotorch, since it seems simple enough, and I'd like to avoid an extra required package that is not in pip. Naively, it occurs to me that I could just divide self.f by its norm after every update step. It is not obvious to me, though, whether this would mess with my gradients and learning somehow, although I don't see why it should.

So, does it sound right to impose the unit-norm constraint by normalizing the vectors after each update step?
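For reference, the naive projection I have in mind would look like the sketch below. The Filters class and the loss are placeholders; the key point is renormalizing inside torch.no_grad() so the projection itself is not tracked by autograd:

```python
import torch
import torch.nn as nn

class Filters(nn.Module):
    # Placeholder model: its only parameters are the filters self.f.
    def __init__(self, n_filters, n_dims):
        super().__init__()
        self.f = nn.Parameter(torch.randn(n_filters, n_dims))

model = Filters(4, 8)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

loss = (model.f ** 2).sum()  # placeholder loss
loss.backward()
opt.step()

# Project each filter back onto the unit sphere after the update.
# no_grad ensures the in-place normalization is not recorded
# in the autograd graph.
with torch.no_grad():
    model.f /= model.f.norm(dim=1, keepdim=True)
```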

A quick check of the geotorch code indicates that it uses torch.nn.utils.parametrize.register_parametrization to register Sphere on the parameter, so you could try the same approach.
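For illustration, a minimal version of that pattern could look like this. The UnitNorm module here is a hypothetical stand-in for geotorch's Sphere; register_parametrization makes model.f a view computed by UnitNorm.forward from an unconstrained underlying tensor, so gradients flow through the normalization automatically:

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class UnitNorm(nn.Module):
    # Maps an unconstrained tensor to one with unit-norm rows.
    def forward(self, x):
        return x / x.norm(dim=-1, keepdim=True)

class Filters(nn.Module):
    # Placeholder model holding the filter parameters.
    def __init__(self, n_filters, n_dims):
        super().__init__()
        self.f = nn.Parameter(torch.randn(n_filters, n_dims))

model = Filters(4, 8)
parametrize.register_parametrization(model, "f", UnitNorm())

# model.f now always has unit-norm rows; the unconstrained
# tensor that the optimizer actually updates lives at
# model.parametrizations.f.original.
loss = model.f.sum()
loss.backward()
```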


I solved the problem using the parametrizations tool in PyTorch. I wrote a blog post about how you'd implement the unit-vector constraint I mentioned, and also how to constrain a matrix to be symmetric positive definite: Easy constrained optimization in Pytorch with Parametrizations - Daniel Herrera-Esposito
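The symmetric-positive-definite case follows the same parametrization pattern. The blog post's actual construction may differ; the SPD module below is one standard choice (A Aᵀ plus a small multiple of the identity), labeled here as an illustrative sketch:

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class SPD(nn.Module):
    # One common map from an unconstrained square matrix to a
    # symmetric positive-definite one: A @ A.T + eps * I.
    def forward(self, a):
        return a @ a.T + 1e-5 * torch.eye(a.shape[0])

class Model(nn.Module):
    # Placeholder model with a matrix-valued parameter.
    def __init__(self, n):
        super().__init__()
        self.cov = nn.Parameter(torch.randn(n, n))

m = Model(3)
parametrize.register_parametrization(m, "cov", SPD())

# m.cov is now symmetric with strictly positive eigenvalues.
eigvals = torch.linalg.eigvalsh(m.cov)
```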


Really cool post! Thanks for sharing it here.
