TVTensor, hooks and Regularizers

  1. I was just going through the documentation for transforms v2 in torchvision, which says it adds TVTensors. Could anyone please explain why they were introduced and what exactly they improve? (I was not able to figure this out from the docs.)

  2. Also, why isn't there any documentation about hooks in PyTorch? :confused: I am having to read Medium posts / previous threads on this forum to learn more…

  3. PyTorch has no kernel regularizers / activity regularizers like TF… Of course, using weight decay is a workaround for L2 regularization, but won't that make it difficult to apply on a per-layer basis (of course, there is the per-parameter-group option via optim)? And as far as L1 regularization / activity regularization goes, would the best option be to use hooks and add them to the desired layers? Most of the torch / Stack Overflow threads ask us to just call .norm() on the desired layers and add it to the total loss (but this way, won't the total loss be backpropagated through all the parameters?)

Details about TVTensors can be found here.
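To add some intuition: the core idea behind TVTensors is that they are `torch.Tensor` subclasses, which lets a single v2 transform dispatch on the input's type and handle an image and its bounding boxes (or masks) consistently in one pipeline. Here is a conceptual sketch of that dispatch mechanism, assuming a hypothetical `BoundingBoxes` subclass and `horizontal_flip` function (not the real torchvision internals):

```python
import torch

class BoundingBoxes(torch.Tensor):
    """Hypothetical stand-in for a TVTensor subclass carrying box semantics."""
    pass

def horizontal_flip(x, width):
    # The same call works for images and boxes; behaviour depends on type.
    if isinstance(x, BoundingBoxes):
        flipped = x.clone()
        # Remap x1/x2 coordinates instead of flipping pixel values.
        flipped[:, [0, 2]] = width - x[:, [2, 0]]
        return flipped
    # Plain tensors are treated as images: flip along the last (width) dim.
    return torch.flip(x, dims=[-1])

img = torch.arange(12.0).reshape(1, 3, 4)  # a tiny 1x3x4 "image"
boxes = torch.tensor([[0.0, 0.0, 1.0, 2.0]]).as_subclass(BoundingBoxes)

print(horizontal_flip(img, width=4).shape)  # image pixels flipped
print(horizontal_flip(boxes, width=4))      # box coords remapped
```

With plain tensors (v1 transforms), the transform has no way to tell an image apart from a box tensor, so targets had to be transformed by separate, manually synchronized code.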

Your other points sound valid, so it would be great if you would be interested in contributing more docs etc.
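On the regularization question, a minimal sketch of the three approaches mentioned (parameter names and coefficients are illustrative, not canonical):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# (1) Per-layer L2 via optimizer param groups: weight_decay is applied
# per group, so it is effectively per-layer, not global.
optimizer = torch.optim.SGD([
    {"params": model[0].parameters(), "weight_decay": 1e-4},
    {"params": model[2].parameters(), "weight_decay": 0.0},
], lr=0.1)

# (2) L1 on selected parameters: adding this term to the loss does run
# backward through the whole graph, but the gradient contribution of the
# penalty itself is nonzero only for the parameters appearing in it, so
# the other parameters are not regularized by it.
def l1_penalty(params, lam=1e-4):
    return lam * sum(p.abs().sum() for p in params)

# (3) Activity regularization via a forward hook on a chosen layer:
# accumulate a penalty on that layer's output during the forward pass.
activity_penalties = []
def activity_hook(module, inputs, output):
    activity_penalties.append(1e-4 * output.abs().sum())
model[0].register_forward_hook(activity_hook)

x, y = torch.randn(4, 8), torch.randn(4, 2)
loss = nn.functional.mse_loss(model(x), y)
loss = loss + l1_penalty(model[0].parameters()) + sum(activity_penalties)
activity_penalties.clear()  # reset for the next iteration
loss.backward()
```

So the `.norm()`-on-the-loss pattern from those threads is fine: the extra backward work is small, and only the penalized parameters receive gradient from the penalty term.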

Awesome! Sure will do it :slight_smile:
