Variable inside loop

Hi
Is it necessary in this case to put the autograd Variables inside the epoch loop?
Are the values wrapped in Variables computed automatically by autograd?

Thanks a lot!

The Variables are only needed to wrap inputs/targets so that autograd functions won’t complain about them. Gradients will not be computed for the inputs/targets, because their Variables do not have requires_grad=True.
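A minimal sketch of what this looks like in a training loop. The model, loss, and optimizer here are placeholders for illustration; the point is that wrapping inputs/targets in Variables inside the epoch loop is fine, and since requires_grad defaults to False, no gradients are computed for them — only for the model parameters:

```python
import torch
from torch.autograd import Variable

# Placeholder model/criterion/optimizer, just for illustration
model = torch.nn.Linear(3, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(2):
    # Wrapping inside the loop is fine; requires_grad defaults to False,
    # so autograd will not track gradients for inputs/targets themselves.
    inputs = Variable(torch.randn(4, 3))
    targets = Variable(torch.randn(4, 1))

    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()   # gradients flow only to model parameters
    optimizer.step()

print(inputs.requires_grad)   # False
```

(In recent PyTorch versions the Variable wrapper is a no-op and plain Tensors can be used directly, but the same rule applies: tensors are created with requires_grad=False by default.)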