Opinion on a new (sinusoid) activation function

I would like to implement a sinusoid activation function in torch.nn.functional. This is motivated by the strong results demonstrated using this activation function in Implicit Neural Representations with Periodic Activation Functions (ArXiv, paper’s website, and GitHub repo).

I’m curious whether this looks like a valuable contribution to PyTorch?


The general policy we have here is the following:

  • We expect the given paper to have a significant number of citations and to have proven valuable over time (we want to keep the core library small).
  • Exceptions can happen when it is hard/impossible to implement a given paper with the tools we currently provide.

In this case, you can use the regular `sin` function from torch, right? What would be different if you added it to `nn.functional`?
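As the reply suggests, the activation can already be expressed with plain `torch.sin`. A minimal sketch of wrapping it as a module, assuming the SIREN-style form `sin(w0 * x)` (the `w0` frequency scale and the `Sine` class name are illustrative, not an existing PyTorch API):

```python
import torch
import torch.nn as nn


class Sine(nn.Module):
    """Sine activation, sin(w0 * x), built from the existing torch.sin."""

    def __init__(self, w0: float = 1.0):
        super().__init__()
        self.w0 = w0  # frequency scale; the SIREN paper uses 30 for the first layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sin(self.w0 * x)


# Usage: drop it into a model like any other activation.
layer = nn.Sequential(nn.Linear(2, 8), Sine(w0=30.0))
out = layer(torch.randn(4, 2))
```

This shows why no new `nn.functional` entry is strictly needed: composing `torch.sin` with a scalar multiply covers the use case.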

Hi @albanD,
Thank you for explaining the reasoning behind adding new features to the library. Keeping the policy in mind, I have to agree with you that there’s not much reason to add a sin activation function as of yet.
