Projecting embeddings onto a hypersphere of radius 1

Hi all, I have been trying to train a triplet-loss-based neural network and have been having trouble getting it to converge. I recently came across “FaceNet: A Unified Embedding for Face Recognition and Clustering”, which mentions that the embeddings should lie on a hypersphere of radius 1 in n-dimensional space, and I can’t seem to find a way to do this in my code. Does PyTorch have a built-in function for this?

I think you can use nn.functional.normalize() to L2-normalize the embeddings so that they lie on a hypersphere of radius 1. I think the below would work:

import torch.nn.functional as F

# L2-normalize each row (dim=1) so every embedding has unit norm
normalized_embeddings = F.normalize(embeddings, p=2, dim=1)
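As a quick sanity check, you can verify that each normalized row really has unit norm. A minimal sketch, assuming a hypothetical batch of random 128-d embeddings:

```python
import torch
import torch.nn.functional as F

# hypothetical batch: 4 embeddings in a 128-dimensional space
embeddings = torch.randn(4, 128)

# L2-normalize each row so it lies on the unit hypersphere
normalized = F.normalize(embeddings, p=2, dim=1)

# every row's L2 norm should now be 1 (up to floating-point error)
norms = normalized.norm(p=2, dim=1)
print(norms)  # each value ~= 1.0
```

Typically you apply this as the last step of the forward pass, right before computing the triplet loss, so the loss always sees unit-length embeddings.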

That’s perfect. Thanks @AbdulsalamBande