Backward through a loss on F(G(x)) with F fixed

Hi,
I’m trying to optimize a loss function (called Loss below) with respect to the weights of a CNN named G only. I also have a function F, a pre-trained CNN with some post-processing (fixed but non-differentiable), that I want to keep fixed. But when I set requires_grad to False for F, I get an error, and when I set it to True, F’s weights get updated. How can I achieve this? Can you help me? Thanks a lot.

criterion = nn.CrossEntropyLoss()
loss = criterion(F(G(x)), F(x))
loss.backward()

Can you include some reproducible code that highlights the error you get? In principle you should be able to set requires_grad to False; it would be useful to see the specific error you’re getting.
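
For reference, here’s a minimal sketch of that setup (the placeholder F and G modules, the shapes, and the argmax targets are my assumptions, not your actual models). It freezes F’s parameters so only G is updated, and it assumes F’s forward pass is differentiable with respect to its input so gradients can still flow back to G:

import torch
import torch.nn as nn

# Placeholder networks; substitute your actual pre-trained F and trainable G.
F = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))  # pre-trained, kept fixed
G = nn.Sequential(nn.Conv2d(1, 1, 3, padding=1))     # the network being trained

# Freeze every parameter of F so backward() never updates its weights.
for p in F.parameters():
    p.requires_grad_(False)
F.eval()  # also fixes dropout / batch-norm behavior

optimizer = torch.optim.SGD(G.parameters(), lr=1e-3)  # only G's weights
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 1, 28, 28)

optimizer.zero_grad()
logits = F(G(x))                  # gradients flow through F (frozen) into G
with torch.no_grad():             # the target branch needs no gradients
    target = F(x).argmax(dim=1)   # CrossEntropyLoss expects class indices
loss = criterion(logits, target)
loss.backward()
optimizer.step()

Note that gradients only reach G if everything F applies between G(x) and the loss is differentiable; any truly non-differentiable post-processing in that path would break the backward pass regardless of requires_grad, which is why seeing your exact error would help.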