Hi,
Do torch.stft and torch.istft support autograd? Is there a way to back-propagate through them?
Thanks
Yes, autograd is able to backpropagate through these methods, as seen in this small example:
import torch

input = torch.randn(16, requires_grad=True)
out = torch.stft(input, 10)
out.mean().backward()
print(input.grad)
# tensor([-3.5224e-02, 1.5741e-01, -6.3721e-02, 1.5741e-01, -2.8497e-02,
# 1.4815e-01, -1.8626e-09, 9.2593e-02, -1.8626e-09, 9.2593e-02,
# 2.8497e-02, 1.0185e-01, 3.5224e-02, 5.5556e-02, 6.3721e-02,
# 2.7778e-02])
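The same applies to torch.istft, so gradients also flow through a full stft -> istft round trip. A minimal sketch (note: newer PyTorch releases require return_complex=True for real inputs, and the n_fft=10 value just mirrors the snippet above):

```python
import torch

# Backpropagate through an stft -> istft round trip.
x = torch.randn(64, requires_grad=True)
spec = torch.stft(x, n_fft=10, return_complex=True)
# length=x.numel() trims the reconstruction back to the input size
recon = torch.istft(spec, n_fft=10, length=x.numel())
recon.sum().backward()
print(x.grad.shape)  # torch.Size([64])
```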
Hi,
Thanks for your reply. I tried your example but got:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/torch/_tensor.py", line 307, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/torch/autograd/__init__.py", line 154, in backward
Variable._execution_engine.run_backward(
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
My PyTorch version is 1.10.2+cu113. Does this mean I need to update to a newer version?
Thanks.
No, I don’t think you need to update, as my code snippet also works in 1.10.2:
>>> import torch
>>> torch.__version__
'1.10.2+cu113'
>>> input = torch.randn(16, requires_grad=True)
>>> out = torch.stft(input, 10)
>>> out.mean().backward()
>>> print(input.grad)
tensor([-3.5224e-02, 1.5741e-01, -6.3721e-02, 1.5741e-01, -2.8497e-02,
1.4815e-01, -1.8626e-09, 9.2593e-02, -1.8626e-09, 9.2593e-02,
2.8497e-02, 1.0185e-01, 3.5224e-02, 5.5556e-02, 6.3721e-02,
2.7778e-02])
So I guess you might not have used my code snippet directly, or you might have disabled gradient calculation with a context manager or globally.
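For reference, wrapping the forward pass in torch.no_grad() is one way to reproduce exactly that RuntimeError. A small sketch (return_complex=True is used here so the snippet also runs on newer PyTorch releases, where the output is complex and needs .abs() before backward):

```python
import torch

input = torch.randn(16, requires_grad=True)
# Gradient tracking is disabled inside this context manager,
# so `out` has no grad_fn and backward() fails.
with torch.no_grad():
    out = torch.stft(input, 10, return_complex=True)

try:
    out.abs().mean().backward()
except RuntimeError as e:
    print(e)  # element 0 of tensors does not require grad ...
```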
Yes it works. I mistakenly disabled the gradient calculation. Thank you very much!