How to sync module weights, but not gradients?

So I have an encoder-decoder model and also a GAN discriminator, like this:
out = Generator.decoder(Generator.encoder(inp))
and discriminator_out = D_encoder(out)
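
To make the setup concrete, here is a minimal sketch of what I mean (layer sizes and module names are just placeholders, my real model is bigger):

```python
import torch
import torch.nn as nn

def make_encoder():
    # placeholder layers, just so the example runs
    return nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 16))

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = make_encoder()
        self.decoder = nn.Linear(16, 64)

G = Generator()
D_encoder = make_encoder()  # same layers as G.encoder, currently separate parameters

inp = torch.randn(8, 64)
out = G.decoder(G.encoder(inp))
discriminator_out = D_encoder(out)  # (real/fake scoring head omitted here)
```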

I want to share weights between Generator.encoder and D_encoder, because they have the same layers and do roughly the same job. But this sharing should not let the GAN cheat (e.g. the generator's gradients making the shared encoder worse, so the discriminator gives wrong scores to fake samples).
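
What I'm imagining is something like the following (a rough sketch using the names from the snippet above; I don't know if this is actually a good approach, hence the question): keep the two encoders as separate modules so gradients never flow between them, and periodically copy the generator encoder's weights into the discriminator encoder outside of autograd.

```python
# sync weights from G.encoder into D_encoder without any gradient flow
with torch.no_grad():
    for p_d, p_g in zip(D_encoder.parameters(), G.encoder.parameters()):
        p_d.copy_(p_g)
```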

How should I approach this?