Hi,
I’m trying to implement the gradient penalty for WGANs. From what I understand, this should work now that (https://github.com/pytorch/pytorch/pull/1643) has been merged into master. I installed from source today, but I’m getting a segmentation fault when I call torch.autograd.grad. That said, I’ve never used that function before, so maybe I’m doing it wrong. In any case, any help is welcome.
Thanks,
Lucas
(code) https://github.com/pclucas14/WassersteinGAN/blob/master/main.py
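For context, the computation I’m aiming for looks roughly like this (a minimal sketch, not the exact code from the repo above; `critic` and the tensor shapes are placeholders):

```python
import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Interpolate between real and fake samples with a random weight per example
    batch_size = real.size(0)
    eps = torch.rand(batch_size, 1, device=real.device).expand_as(real)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)

    # Critic scores on the interpolated points
    scores = critic(interp)

    # Gradients of the scores w.r.t. the interpolated inputs
    grads, = torch.autograd.grad(
        outputs=scores,
        inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,   # needed so the penalty itself is differentiable
        retain_graph=True,
    )

    # Penalize deviation of the per-example gradient norm from 1
    grad_norm = grads.view(batch_size, -1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1) ** 2).mean()
```

The `create_graph=True` flag is what lets the penalty term be backpropagated through during the critic update.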