torch.jit.trace doesn’t allow us to trace a module that has shared parameters.
Does anyone know the reason?
I guess the restriction aims to prevent the shared parameters from being updated concurrently (and corrupted).
If so, I think parameter sharing is safe when we know that the modules accessing the parameters are NOT run concurrently.
For example, modules that run one after another in a sequence never update their parameters at the same time.
Could someone tell me whether I can use tracing in that case (e.g., by removing the shared-parameter check)?
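To make the situation concrete, here is a minimal sketch of the kind of module I mean (the module name and layer sizes are my own; whether `torch.jit.trace` rejects it, and with what message, depends on the PyTorch version):

```python
import torch
import torch.nn as nn


class TiedModel(nn.Module):
    """Two linear layers that share a single weight Parameter."""

    def __init__(self):
        super().__init__()
        self.a = nn.Linear(4, 4)
        self.b = nn.Linear(4, 4)
        # Tie the weights: both layers now hold the SAME Parameter object.
        self.b.weight = self.a.weight

    def forward(self, x):
        # The two layers run strictly in sequence, never concurrently.
        return self.b(self.a(x))


model = TiedModel()
x = torch.randn(1, 4)

try:
    traced = torch.jit.trace(model, x)
    print("tracing succeeded")
except Exception as e:
    # Older PyTorch versions reject traced modules with shared parameters.
    print("tracing failed:", e)
```

In eager mode the tied weights are unproblematic here, since `self.a` and `self.b` are applied one after the other; the question is whether the tracer's rejection is really necessary in this sequential case.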