autograd


| Topic | Replies | Activity |
| --- | --- | --- |
| ReduceLROnPlateau | 2 | July 14, 2019 |
| Parameters added to a pre-trained model have None grads | 5 | July 14, 2019 |
| [Hessian Vector Product] RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior | 5 | July 13, 2019 |
| Epoch is slow with an addition to the loss function | 2 | July 13, 2019 |
| Can't update weight values in PyTorch .backward() | 3 | July 13, 2019 |
| Doubt about 'Element 0 of tensors does not require grad and does not have a grad_fn' | 2 | July 13, 2019 |
| CUDA Invalid Configuration Error on GPU Only | 5 | July 12, 2019 |
| Why does the model fail to converge without manual weight initialization? | 1 | July 12, 2019 |
| torch.multiprocessing rebuild_cuda_tensor having trouble with bn.num_batches_tracked | 2 | October 17, 2018 |
| Autograd_output in a simple custom linear layer | 3 | July 11, 2019 |
| Customized convolution on high-dimensional matrices | 1 | July 11, 2019 |
| Loss.backward() raises the error 'grad can be implicitly created only for scalar outputs' | 7 | July 11, 2019 |
| How to access parameters using the model's attribute names | 2 | July 10, 2019 |
| ConvNet+LSTM video classification: use of GPUs | 1 | July 10, 2019 |
| .cuda() extremely slow after calling loss.backward() | 7 | July 10, 2019 |
| Where can I find the backward implementation of an existing Function in PyTorch? | 1 | July 10, 2019 |
| Element 0 of tensors does not require grad? | 3 | July 10, 2019 |
| Running a network twice with different sets of parameters in a single epoch/optimizer step | 1 | July 9, 2019 |
| Getting NaN after the first backward pass even after clipping the gradients | 2 | July 9, 2019 |
| Defining parameters by some transformation OR Retaining sub-graphs, but not the whole graph | 1 | July 9, 2019 |
| Gradient with respect to product of variables | 3 | July 9, 2019 |
| Should it really be necessary to do var.detach().cpu().numpy()? | 5 | July 9, 2019 |
| Loss and accuracy stuck, very low gradient | 2 | July 8, 2019 |
| Implementing new optimization algorithms | 3 | July 8, 2019 |
| Warning: NaN or Inf found in input tensor | 5 | July 8, 2019 |
| How to construct a matrix consisting of nn.Parameter and do matmul the correct way? | 3 | July 8, 2019 |
| Removing part of the loss function leads to an out-of-memory error | 2 | July 6, 2019 |
| Help with implementing my own autograd.Function | 1 | July 6, 2019 |
| 'gradient' argument in out.backward(gradient) | 11 | July 6, 2019 |
| When is autograd available and is it always accurate? | 4 | July 5, 2019 |
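
Two errors recur throughout the topics above: "grad can be implicitly created only for scalar outputs" and "One of the differentiated Tensors appears to not have been used in the graph". A minimal sketch of both, assuming a recent PyTorch build; the tensor names are illustrative:

```python
import torch

# Non-scalar outputs need an explicit `gradient` argument to backward();
# calling out.backward() alone raises "grad can be implicitly created
# only for scalar outputs".
x = torch.randn(3, requires_grad=True)
out = x * 2                           # non-scalar output
out.backward(torch.ones_like(out))    # vector for the vector-Jacobian product
print(x.grad)                         # tensor([2., 2., 2.])

# allow_unused=True silences "One of the differentiated Tensors appears
# to not have been used in the graph" when an input does not affect the output.
a = torch.randn(3, requires_grad=True)
b = torch.randn(3, requires_grad=True)
loss = (a ** 2).sum()                 # `b` deliberately unused
grad_a, grad_b = torch.autograd.grad(loss, (a, b), allow_unused=True)
print(grad_a)                         # equals 2 * a
print(grad_b)                         # None: `b` never entered the graph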