How do I find out what the version numbers of a Module’s Parameters are?
I know that PyTorch keeps track of this information, because it will raise an exception if you try to backpropagate through a graph whose Parameters have been updated in place since the graph was built.
I need this because I have a pretty complex and unusual computational graph that is dynamically generated, and I would like to perform sanity checks on the version numbers.
You can access these with the read-only `._version` attribute on Tensors.
Note though that this is an internal implementation detail, so the absolute values might change across PyTorch releases (comparing two counters for equality or inequality will of course still be meaningful).
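As a quick illustration (a sketch only; `_version` is internal and its absolute values are not guaranteed), every in-place operation bumps the counter:

```python
import torch

p = torch.nn.Parameter(torch.zeros(3))
v0 = p._version  # counter for the freshly created tensor

with torch.no_grad():
    p.add_(1.0)  # in-place update, as an optimizer step would do

# Each in-place op increments the version counter by one.
assert p._version == v0 + 1
```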
Two follow-up questions:
- I want to store the version numbers of all Module parameters at certain points, and then later check if any of these checkpoints used Parameters that are now outdated. Will this work reliably?
- Can I assume that this gets serialized correctly when using `state_dict()`, and will continue to work after reloading? Or can this cause the version numbers to jump, or even become smaller?
Yes, checking whether they changed or not will always work reliably.
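A minimal sketch of that checkpoint pattern (the helper names `snapshot_versions` and `stale_params` are made up for illustration, and again `_version` is internal API):

```python
import torch

def snapshot_versions(module):
    # Record the current version counter of every parameter.
    return {name: p._version for name, p in module.named_parameters()}

def stale_params(module, snapshot):
    # Names of parameters modified in place since the snapshot was taken.
    return [name for name, p in module.named_parameters()
            if p._version != snapshot[name]]

model = torch.nn.Linear(4, 2)
snap = snapshot_versions(model)

with torch.no_grad():
    model.weight.mul_(0.5)  # simulate an optimizer update

print(stale_params(model, snap))  # → ['weight']
```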
Since a deserialized Tensor no longer shares data with the original one, there is no link between their version counters. So I would expect the counter to be reset to 0 on loading.
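You can see this independence after a save/load round trip (a sketch; it avoids asserting the exact counter value of the reloaded tensor, since that is an implementation detail):

```python
import io
import torch

model = torch.nn.Linear(4, 2)
with torch.no_grad():
    model.weight.add_(1.0)  # bump the original's version counter

# Round-trip the state dict through an in-memory buffer.
buf = io.BytesIO()
torch.save(model.state_dict(), buf)
buf.seek(0)
reloaded = torch.load(buf)

# The reloaded tensor is a fresh copy: further in-place updates to the
# original no longer affect its version counter.
v_reloaded = reloaded['weight']._version
with torch.no_grad():
    model.weight.add_(1.0)
assert reloaded['weight']._version == v_reloaded
```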