Hey, I want to intercept training right before the `optimizer.step()` call, where the weights are updated.
My issues arise when trying to access the values, especially the following:
- Access input/output weights of all layers.
- Access loss of all layers.
- Access activations of all layers.
- Set those values to new values.
I found this: [How to split backward process wrt each layer of neural network?], but it says the solution is just a hack. What would be a proper approach to handle this?
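For context, here is a minimal sketch of the kind of interception I have in mind, using forward hooks to capture activations and editing parameters/gradients in place before the step (the toy model and hook names are just illustrative):

```python
import torch
import torch.nn as nn

# Toy model; layer names ('0', '2') come from nn.Sequential.
model = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Capture per-layer activations with forward hooks.
activations = {}

def make_hook(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

for name, module in model.named_modules():
    if isinstance(module, nn.Linear):
        module.register_forward_hook(make_hook(name))

x = torch.randn(8, 4)
target = torch.randn(8, 2)
loss = nn.functional.mse_loss(model(x), target)

optimizer.zero_grad()
loss.backward()

# Intercept here, before the update: weights and grads are plain
# tensors now and can be inspected or overwritten in place.
with torch.no_grad():
    for name, p in model.named_parameters():
        p.grad.clamp_(-1.0, 1.0)  # e.g. clip gradients per parameter
        # p.copy_(new_value)      # or overwrite the weight itself

optimizer.step()
```

This covers weights and activations, but I'm unsure it's the right way to get a per-layer notion of loss, or whether hooks are the intended mechanism here at all.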