autograd
Topic | Replies | Activity
About the autograd category | 1 | May 13, 2017
Why does this function break the gradient tree? | 5 | December 11, 2019
Loss abnormal after using BatchNorm1d | 1 | December 11, 2019
Grad is always None for leaf variable | 3 | December 10, 2019
How to delete every grad after training? | 10 | December 10, 2019
How to save neuron values after activation, and gradients with respect to activations, using a hook? | 2 | December 10, 2019
Get the gradient tape | 6 | December 10, 2019
Comparing BatchNorm output (forward and backward) on CPU and on GPU | 3 | December 10, 2019
Implementing multiple recomputations on top of `torch.utils.checkpoint` | 4 | December 9, 2019
Optimizer.step() disregards learning rate with multiple nn.Parameter() | 6 | December 9, 2019
How to detach() rows of a tensor? | 4 | December 9, 2019
How to print the computed gradient values for a network | 12 | December 9, 2019
Trying to understand a C error: torch.autograd.detect_anomaly() magically removes the error | 5 | December 9, 2019
Custom nn.Conv2d | 15 | December 9, 2019
CUDA memory leakage | 14 | December 9, 2019
How to create a list of ModuleLists | 1 | December 8, 2019
Compute the Hessian matrix of a network | 21 | December 8, 2019
Whether to use detach() or not while cloning intermediate tensors during training? | 1 | December 7, 2019
Compute gradient of bitwise OR | 4 | December 7, 2019
Passing Params to an Optimizer | 5 | December 6, 2019
How to do exponential learning rate decay in PyTorch? | 4 | December 6, 2019
Why does not removing a register_hook() slow down training gradually? | 3 | December 6, 2019
How to print the CrossEntropyLoss of data | 7 | December 5, 2019
Do I need to zero_grad when I use torch.autograd.grad()? | 2 | December 5, 2019
Speed of different batch sizes | 3 | December 5, 2019
Jacobian gradient matrix between two images | 4 | December 5, 2019
Is this the right way to use CrossEntropyLoss? | 2 | December 5, 2019
Calculate second derivative related to preactivations | 6 | December 4, 2019
Calculate the "backward" for only one Tensor | 6 | December 4, 2019
Backprop from a given gradient | 4 | December 4, 2019