Hello,
I assumed that scripting a module at the same precision (fp32) would produce exactly the same output as the original eager module.
It did when I compared right after scripting, but when I stored the scripted module with torch.jit.save() and then loaded it back with torch.jit.load(), the loaded module's output was slightly different from the eager one.
Do torch.jit.save() and torch.jit.load() change some operations?
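For reference, here is a minimal sketch of the comparison I mean (the `Net` module below is just a stand-in, and I'm saving to an in-memory buffer instead of a file):

```python
import io
import torch
import torch.nn as nn

# Hypothetical toy module, only to illustrate the comparison.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 4)

    def forward(self, x):
        return torch.relu(self.fc(x))

torch.manual_seed(0)
eager = Net().eval()
scripted = torch.jit.script(eager)

x = torch.randn(2, 8)
with torch.no_grad():
    y_eager = eager(x)
    y_scripted = scripted(x)

# Round-trip the scripted module through save/load.
buf = io.BytesIO()
torch.jit.save(scripted, buf)
buf.seek(0)
loaded = torch.jit.load(buf)

with torch.no_grad():
    y_loaded = loaded(x)

# Freshly scripted vs. eager: bitwise identical in my runs.
print(torch.equal(y_eager, y_scripted))
# Loaded vs. eager: small but nonzero difference.
print((y_eager - y_loaded).abs().max())
```

The difference I see is on the order of float32 rounding error, not a wrong result, which is why I suspect save/load changes how some operations are executed (e.g. fusion) rather than the weights themselves.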