The error you are seeing will be raised if you try to call `backward` on the output of any of the 3 models, since you have frozen all trainable parameters by setting their `.requires_grad` attribute to `False`.
If you want to compute gradients and thus need to call `backward` on any output/loss, you would have to make sure some parameters are still trainable.
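A minimal sketch reproducing the issue (the `nn.Linear` model here is just a stand-in for your frozen models):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)

# Freeze all parameters: now no tensor in the forward graph requires gradients
for param in model.parameters():
    param.requires_grad = False

out = model(torch.randn(1, 4)).sum()
try:
    out.backward()
except RuntimeError as err:
    # "element 0 of tensors does not require grad and does not have a grad_fn"
    print(err)

# Unfreeze the parameters (or at least some of them) so backward can compute gradients
for param in model.parameters():
    param.requires_grad = True

loss = model(torch.randn(1, 4)).sum()
loss.backward()  # works now; model.weight.grad is populated
```

Freezing only a subset (e.g. leaving the last layer trainable, as is common in fine-tuning) also avoids the error, since the loss then has at least one trainable leaf in its graph.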