Add "sincos" layers in the nn

I am wondering if there is an existing function that would allow us to add two nodes, sin and cos, to a net like an activation function?

I’m not sure I understand the use case correctly, but in case you want to apply torch.cos and torch.sin on an activation you could directly do it via:

def forward(self, x): # forward method of your model
    ...
    x = self.layer(x)
    ...
    x = torch.sin(x)
    x = torch.cos(x)
    ...
    return x

Alternatively, you could create a custom nn.Module and use these operations as an “activation” function.
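A minimal sketch of such a custom module, mirroring the sequential sin/cos from the snippet above (the class name `SinCos` is made up for illustration):

```python
import torch
import torch.nn as nn

class SinCos(nn.Module):
    """Hypothetical activation module applying sin, then cos, elementwise."""
    def forward(self, x):
        return torch.cos(torch.sin(x))

act = SinCos()
out = act(torch.randn(4, 8))
print(out.shape)  # torch.Size([4, 8])
```

Since the module has no parameters, it can be dropped into an nn.Sequential like any other activation.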


Thank you ptrblck, let me explain my question more clearly. I am implementing a GAN, and in order to ensure the generator output is periodic, I want to add a “layer” as the last layer of the generator that changes x into (sin(x), cos(x)). I want this “layer” to be part of backprop as well. Since the dimension changes, I use torch.cat to handle it:

    def forward(self, noise):
        angle = self.gen(noise)
        output = torch.cat((torch.sin(angle), torch.cos(angle)))
        return output
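One detail worth double-checking in this forward (a hedged aside, assuming `self.gen(noise)` returns a `(batch, features)` tensor): torch.cat defaults to `dim=0`, which stacks along the batch dimension, whereas doubling the feature dimension requires an explicit `dim=-1`:

```python
import torch

# stand-in for self.gen(noise): shape (batch, features)
angle = torch.randn(4, 3, requires_grad=True)

# default dim=0 stacks along the batch dimension
out_batch = torch.cat((torch.sin(angle), torch.cos(angle)))
print(out_batch.shape)  # torch.Size([8, 3])

# dim=-1 keeps the batch size and doubles the feature dimension
out_feat = torch.cat((torch.sin(angle), torch.cos(angle)), dim=-1)
print(out_feat.shape)  # torch.Size([4, 6])
```

Which one is correct depends on how the discriminator expects its input to be laid out.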

However, there are two questions:

  1. Is the torch.cat() function compatible with backprop? I see that the value of grad_fn becomes <CatBackward>.
  2. I checked the output of the generator after adding the output line to the forward function, but didn’t see any change. I must be missing something about the forward function. Could you please give me some guidance (or a link) so I can figure it out?

Thank you very much !!

  1. Yes, torch.cat has a valid grad_fn and will not break the backpropagation.
  2. What did you compare the outputs against? Your forward looks good and the added operations should be used. You could add additional print statements to the forward to make sure it’s really called and to check intermediate outputs.
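A quick way to convince yourself of point 1 is to backpropagate through the concatenated sin/cos output and compare the resulting gradient against the analytic one (a small standalone check, not taken from the thread):

```python
import torch

angle = torch.randn(4, 3, requires_grad=True)
output = torch.cat((torch.sin(angle), torch.cos(angle)), dim=-1)
print(output.grad_fn)  # a CatBackward node, so the op is recorded in the graph

output.sum().backward()
# d/dx [sin(x) + cos(x)] = cos(x) - sin(x)
expected = (torch.cos(angle) - torch.sin(angle)).detach()
print(torch.allclose(angle.grad, expected))  # True
```

If `angle.grad` matches the analytic derivative, gradients are flowing through torch.cat as expected.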