I am wondering if there is an existing function that allows us to add two nodes, sin and cos, to a net like an activation function?
I’m not sure I understand the use case correctly, but in case you want to apply torch.cos and torch.sin on an activation, you could do it directly via:
```python
def forward(self, x):  # forward method of your model
    ...
    x = self.layer(x)
    ...
    x = torch.sin(x)
    x = torch.cos(x)
    ...
    return x
```
or alternatively you could create a custom nn.Module and use these operations as an “activation” function.
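A minimal sketch of such a custom module (the name SinCos and the order of operations are illustrative assumptions, not from the original post):

```python
import torch
import torch.nn as nn

class SinCos(nn.Module):
    """Custom "activation" module applying sin followed by cos."""
    def forward(self, x):
        return torch.cos(torch.sin(x))

act = SinCos()
out = act(torch.randn(2, 3))  # same shape as the input: (2, 3)
```

This module can then be used anywhere a regular activation such as nn.ReLU would go, e.g. inside an nn.Sequential.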
Thank you ptrblck, let me clarify my question. I am implementing a GAN, and in order to ensure the generator output is periodic, I want to add a “layer” as the last layer of the generator that changes x into (sin(x), cos(x)). I want this “layer” to be part of backprop as well. Since the dimension changes, I use torch.cat to solve the problem:
```python
def forward(self, noise):
    angle = self.gen(noise)
    output = torch.cat((torch.sin(angle), torch.cos(angle)))
    return output
```
However, there are two questions:
- Is the torch.cat() function allowed to do backprop? I find that the value of grad_fn becomes <CatBackward>.
- I checked the output of the generator after adding the output line to the forward function, but didn’t see any change. I must be missing something about the forward function. Could you please give me some guidance (or a link) so I can figure it out?
Thank you very much!!
- Yes, torch.cat has a valid grad_fn and will not break the backpropagation.
- What did you compare the outputs against? Your forward looks good and the added operations should be used. You could add additional print statements to the forward to make sure it’s really called and to check intermediate outputs.
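As a quick standalone check (a sketch, not part of the original posts), you can verify that torch.cat keeps the autograd graph intact and that gradients flow back through sin and cos:

```python
import torch

# A leaf tensor standing in for the generator output "angle"
angle = torch.randn(4, requires_grad=True)
output = torch.cat((torch.sin(angle), torch.cos(angle)))

print(output.grad_fn)  # a CatBackward node, so autograd tracked the op
output.sum().backward()
print(angle.grad is not None)  # True: gradients reached the input
```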
Hello,
How can I add a sinc convolution 1d layer before a pretrained wav2vec2xlsr model for ASR to enhance the model?
Thanks in advance.
You could call your custom layer first and pass its output to the pretrained model. Alternatively, you could add the custom layer directly into your model, e.g. by replacing the first layer with an nn.Sequential block containing the new layer as well as the original one.
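The nn.Sequential replacement idea can be sketched as follows. Note this uses a small stand-in model, not the real wav2vec2xlsr API: the attribute name feature_extractor and the plain nn.Conv1d standing in for a sinc convolution are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class DummyPretrained(nn.Module):
    """Stand-in for a pretrained model; NOT the real wav2vec2xlsr class."""
    def __init__(self):
        super().__init__()
        self.feature_extractor = nn.Conv1d(1, 8, kernel_size=3, padding=1)
        self.head = nn.Linear(8, 4)

    def forward(self, x):               # x: (batch, 1, time)
        x = self.feature_extractor(x)   # (batch, 8, time)
        return self.head(x.mean(dim=2))  # pool over time -> (batch, 4)

# Shape-preserving placeholder for the custom sinc conv layer
custom = nn.Conv1d(1, 1, kernel_size=5, padding=2)

model = DummyPretrained()
# Replace the first layer with an nn.Sequential containing the
# new layer followed by the original one
model.feature_extractor = nn.Sequential(custom, model.feature_extractor)

out = model(torch.randn(2, 1, 16))
print(out.shape)  # torch.Size([2, 4])
```

For the real model you would locate its first layer (e.g. via print(model) or model.named_modules()) and apply the same wrapping, making sure the custom layer's output shape matches what the original first layer expects.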
Could you explain it in more detail, with an example of adding a sinc conv1d layer to the wav2vec2xlsr model? I am new to this field.
I appreciate any help.
On Saturday, November 11, 2023, at 7:13 PM, ptrblck via PyTorch Forums <noreply@discuss.pytorch.org> wrote: