Hi Varun!
This error message is telling you (from the tensor shape) that `block3[3].weight` is being modified inplace. `optimizer3.step()` modifies inplace the `Parameter`s it is optimizing.
What’s going on is that you modify `block3`. But `loss` depends on `l2`, so when you call `loss.backward()`, you backpropagate through `block3` again, hence the error.
(You have other similar errors, but when this first error is detected, the call to `.backward()` exits.)
I don’t understand the rationale behind what you are doing, but a fix might be as simple as first calling all of your `.zero_grad()`s and `.backward()`s and then calling all of your `optimizer.step()`s.
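In outline, the reordering would look something like this (again with hypothetical names; `blockN`, `optimizerN`, and the losses stand in for whatever you have):

```python
import torch
import torch.nn as nn

block2 = nn.Linear(4, 4)
block3 = nn.Linear(4, 4)
optimizer2 = torch.optim.SGD(block2.parameters(), lr=0.1)
optimizer3 = torch.optim.SGD(block3.parameters(), lr=0.1)

x = torch.randn(2, 4)

# All of the zero_grad()s and backward()s first ...
optimizer2.zero_grad()
optimizer3.zero_grad()

loss3 = block3(x).sum()
loss3.backward()

loss = block3(block2(x)).sum()
loss.backward()

# ... and only then all of the step()s, so no Parameter is modified
# inplace while a backward pass still needs its saved value.
optimizer2.step()
optimizer3.step()
```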
Please also take a look at this post that explains how to debug such
inplace-modification errors:
Best.
K. Frank