autograd


Topic Replies Activity
About the autograd category 1 May 13, 2017
Gradient flow stopped on a combined model 1 October 22, 2019
How do I backpropagate through a module's parameters? 3 October 22, 2019
RuntimeError: Function 'SqrtBackward' returned nan values in its 0th output 3 October 22, 2019
Compute the Hessian matrix of a network 18 October 22, 2019
RuntimeError: "argmax_cuda" not implemented for 'Bool' 3 October 21, 2019
Why is softplus backward pass not equal to sigmoid? 3 October 21, 2019
One-hot preds: element 0 of tensors does not require grad and does not have a grad_fn 5 October 21, 2019
Partial derivative (torch.autograd.grad) within torch.no_grad() 5 October 21, 2019
Backward fails when I use a composition for torch.nn.Parameter 3 October 20, 2019
Backpropagation multiple times before optimizer.step() 2 October 20, 2019
Keep getting RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn 3 October 20, 2019
Should I do loss.backward() or loss.mean().backward() 10 October 20, 2019
Two output nodes for binary classification 4 October 20, 2019
I tried .backward(retain_graph=True) twice 3 October 19, 2019
Gradient of output wrt specific inputs 3 October 18, 2019
Does the generated grad-graph RETAIN inside a for-loop? 2 October 18, 2019
How to free the graph after create_graph=True 7 October 18, 2019
How to track and add autograd computation graphs for buffers 12 October 18, 2019
Confusion with LR scheduler step() in PyTorch 1.1.0 and later 1 October 18, 2019
Gradients with Argmax in PyTorch 6 October 18, 2019
Custom loss function using two different data sets 6 October 17, 2019
How to access the gradient from F.interpolate 5 October 16, 2019
Example for One of the differentiated Tensors appears to not have been used in the graph 3 October 16, 2019
Understanding of merging two modules 4 October 16, 2019
Confusion about tensor's grad when execute backward 2 October 16, 2019
How to use the 'weight' -- loss functions parameter 5 October 16, 2019
How to normalize the output tensor to [0,1] and then calculate the SSIM between channels as loss? 4 October 16, 2019
out.backward(torch.Tensor([2.0])) doesn't work in PyTorch 1.0.3 but works in PyTorch 1.0.2 5 October 15, 2019
Loss.grad is None for Custom loss function 5 October 15, 2019