Autograd supported operations

Hi All,

I have been reading through autograd recently.

Which operations should I be using in order for gradients to be correctly calculated by autograd?

Thanks!

Hi,

You should be using the torch implementation of any function you use.
All PyTorch functions will work, or raise an error if you’re asking for things that are not supported / non-differentiable.
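For example (a minimal sketch, with arbitrary tensors just for illustration): torch ops are recorded by autograd and can be backpropagated through, while leaving torch (e.g. converting to numpy) is not tracked and raises an error up front.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).sum()        # torch ops build the autograd graph
y.backward()
print(x.grad)            # tensor([2., 2., 2.])

# Leaving torch is not tracked; calling .numpy() on a tensor that
# requires grad raises instead of silently breaking the graph.
try:
    x.numpy()
except RuntimeError as e:
    print(e)             # can't call numpy() on a tensor that requires grad
```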

Thanks, so if I use an operation or function that is not supported by autograd, an error is raised?

Yes.
All differentiable functions should be implemented, though; only a few linear algebra functions that could be differentiated are not.
Of course, non-differentiable functions won’t work.
And functions like .detach() and torch.no_grad() are built to do things outside of autograd, so they are special cases.
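A small illustration of those special cases (tensor names are arbitrary): .detach() returns a tensor cut off from the graph, and anything run under torch.no_grad() is not recorded by autograd.

```python
import torch

x = torch.randn(3, requires_grad=True)

# .detach() gives a tensor that shares data but is cut off from the graph
d = x.detach()
print(d.requires_grad)   # False

# ops run under torch.no_grad() are not recorded by autograd
with torch.no_grad():
    y = x * 2
print(y.requires_grad)   # False

# the same op outside the context is recorded as usual
z = x * 2
print(z.requires_grad)   # True
```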

Thanks, just to clarify: using autograd-supported operations and functions allows gradients to be correctly calculated?

Yes, it will compute the correct gradients if you don’t get any error 🙂
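If you want to double-check a particular function yourself, torch.autograd.gradcheck compares the analytic gradients against finite differences (a rough sketch; it needs double-precision inputs with requires_grad=True, and the function here is just an arbitrary example):

```python
import torch

x = torch.randn(4, dtype=torch.double, requires_grad=True)

def f(t):
    # any composition of differentiable torch ops
    return (t.sin() * t).sum()

# returns True if analytic and numerical gradients agree, raises otherwise
print(torch.autograd.gradcheck(f, (x,)))
```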

And if gradients cannot be correctly computed, do I get an error?

If the function is not differentiable, you’ll get an error.
If gradients cannot be computed (usually due to inplace ops), yes, you will get an error.
In general, if a gradient is computed, it is correct.
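For instance, a typical in-place failure looks like this (a minimal sketch; sigmoid saves its output for the backward pass, so modifying that output in place makes the gradient impossible to compute and autograd raises at backward time):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x.sigmoid()      # sigmoid's backward needs the saved output y
y.add_(1)            # in-place modification invalidates the saved tensor
y.sum().backward()   # RuntimeError: ... modified by an inplace operation
```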
