This is expected, since Autograd won't be able to track 3rd-party operations. You would either need to stick to PyTorch tensors, or you could implement custom `autograd.Function`s using e.g. numpy and implement the backward pass manually.
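A minimal sketch of such a custom `autograd.Function`, using numpy's `exp` as a stand-in for the 3rd-party operation (the op and names here are just illustrative assumptions):

```python
import numpy as np
import torch

class NumpyExp(torch.autograd.Function):
    """Wraps np.exp so it can be used in an Autograd graph.
    The numpy call is invisible to Autograd, so the backward
    pass is written manually: d/dx exp(x) = exp(x)."""

    @staticmethod
    def forward(ctx, x):
        # Detach and move to CPU before handing the data to numpy
        result = np.exp(x.detach().cpu().numpy())
        result = torch.from_numpy(result).to(x.device)
        # Save the output for the backward pass
        ctx.save_for_backward(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        (result,) = ctx.saved_tensors
        # Chain rule: upstream gradient times the local derivative exp(x)
        return grad_output * result

x = torch.randn(3, requires_grad=True)
y = NumpyExp.apply(x).sum()
y.backward()
# x.grad should now match the analytic gradient torch.exp(x)
print(torch.allclose(x.grad, torch.exp(x.detach())))
```

Note that `forward` works on detached data, so anything done inside it is free to use numpy; only `backward` has to return the correct gradient w.r.t. each input.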