Add "sincos" layers in the nn

I am wondering if there is an existing function that allows us to add two nodes, sin and cos, to a net like an activation function?

I’m not sure I understand the use case correctly, but in case you want to apply torch.sin and torch.cos to an activation, you could do it directly via:

def forward(self, x):  # forward method of your model
    ...
    x = self.layer(x)
    ...
    x = torch.sin(x)
    x = torch.cos(x)  # applied sequentially, i.e. x becomes cos(sin(x))
    ...
    return x

Alternatively, you could create a custom nn.Module and use these operations as an “activation” function.
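A minimal sketch of such a custom module (the class name `SinCos` and the choice to concatenate along the last dimension are assumptions, not an existing PyTorch layer):

```python
import torch
import torch.nn as nn

class SinCos(nn.Module):
    """Hypothetical activation: applies sin and cos element-wise and
    concatenates the results, doubling the last dimension."""
    def forward(self, x):
        return torch.cat((torch.sin(x), torch.cos(x)), dim=-1)

act = SinCos()
x = torch.randn(4, 8)
out = act(x)
print(out.shape)  # torch.Size([4, 16])
```

Since the module has no parameters, it can be dropped into any model (e.g. as the last entry of an `nn.Sequential`) without changing the optimizer setup.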


Thank you ptrblck, let me explain my question more clearly. I am implementing a GAN, and in order to ensure the generator output is periodic, I want to add a “layer” as the last layer of the generator that changes x into (sin(x), cos(x)). I want this “layer” to be part of backprop as well. Since the dimension changes, I use torch.cat to solve the problem:

    def forward(self, noise):
        angle = self.gen(noise)
        output = torch.cat((torch.sin(angle), torch.cos(angle)))
        return output

However, there are two questions:

  1. Is the torch.cat() function allowed to do backprop? I see that the value of grad_fn becomes <CatBackward>.
  2. I checked the output of the generator after adding the output line to the forward function, but didn’t see any change. I must be missing something about the forward function. Could you please give me some guidance (or a link) so I can figure it out?

Thank you very much !!

  1. Yes, torch.cat has a valid grad_fn and will not break the backpropagation.
  2. What did you compare the outputs against? Your forward looks good and the added operations should be used. You could add additional print statements to the forward to make sure it’s really called and to check intermediate outputs.
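To illustrate both points, here is a hedged sketch with a toy stand-in for `self.gen` (the real generator is assumed to be larger; the debug print and the `Generator` class here are illustrative only). It also uses `dim=-1` in `torch.cat`, on the assumption that you want to keep the batch dimension intact rather than stack along it (the default `dim=0`):

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.gen = nn.Linear(16, 8)  # toy stand-in for the real generator

    def forward(self, noise):
        angle = self.gen(noise)
        print("forward called, angle shape:", angle.shape)  # debug print
        # dim=-1 doubles the feature dimension; the default dim=0 would
        # stack the two tensors along the batch dimension instead
        output = torch.cat((torch.sin(angle), torch.cos(angle)), dim=-1)
        return output

g = Generator()
out = g(torch.randn(2, 16))
print(out.grad_fn)            # a CatBackward node, so autograd tracks it
out.sum().backward()          # gradients flow through torch.cat
print(g.gen.weight.grad is not None)  # True
```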

Hello,
How can I add a sinc convolution 1d layer before a pretrained wav2vec2xlsr model for ASR to enhance the model?
Thanks in advance.

You could call your custom layer first and pass its output to the pretrained model. Alternatively, you could try to add the custom layer directly into your model, e.g. by replacing the first layer with an nn.Sequential block containing the new layer as well as the original one.
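A rough sketch of both options, using placeholders since I don’t have the actual sinc layer or the wav2vec2xlsr checkpoint here (`sinc_layer` stands in for a real sinc convolution, e.g. from SincNet, and `pretrained` for the loaded wav2vec2 model; the shapes are assumptions):

```python
import torch
import torch.nn as nn

# Placeholders: a plain Conv1d standing in for a sinc convolution,
# and Identity standing in for the pretrained wav2vec2xlsr model.
sinc_layer = nn.Conv1d(1, 1, kernel_size=129, padding=64)
pretrained = nn.Identity()

# Option 1: call the custom layer explicitly, then the pretrained model
def run(waveform):
    x = sinc_layer(waveform)
    return pretrained(x)

# Option 2: wrap both in an nn.Sequential
model = nn.Sequential(sinc_layer, pretrained)

out = model(torch.randn(2, 1, 16000))  # (batch, channels, samples)
print(out.shape)  # torch.Size([2, 1, 16000])
```

Note that the real wav2vec2 model expects a specific input shape and feature extractor, so the sinc layer’s output channels and length must match what the pretrained model’s first layer expects.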

Could you explain it more with an example of adding a sinc conv1d layer to the wav2vec2xlsr model? I am new to this field.
I appreciate any help.


Related discussion: [feature request, idea] Fused torch.sincos(x) or cossin(x) (maybe alias to complex exponential) · Issue #90559 · pytorch/pytorch · GitHub