Additional Argument for Parameter in Custom Optimizer

Hi!
I am writing a custom optimizer based on Adam. For the optimization to work properly, I need to specify an additional argument for each parameter being optimized, namely the curvature of the space that parameter lives in.
Ideally, I would like something similar to this (loop structure taken from the PyTorch Adam implementation):

for group in self.param_groups:
    for p in group['params']:
        state = self.state[p]
        # the extra per-parameter value I want available inside step()
        curvature = state['curvature']

Unfortunately, I could not figure out how to do this even after looking into the source code. Is there any way to achieve it other than implementing a custom Parameter subclass and a custom Tensor subclass?
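
For context, one workaround I have been considering is to pre-populate self.state at construction time, since torch.optim.Optimizer keeps self.state as a plain dictionary keyed by the parameter tensors. Below is a minimal sketch of that idea; CurvatureAdam and its curvatures argument (a dict mapping each Parameter to its curvature value) are hypothetical names I made up for illustration, not anything from PyTorch itself:

import torch
from torch.optim import Optimizer

class CurvatureAdam(Optimizer):
    # Hypothetical sketch: 'curvatures' is an assumed extra argument
    # mapping each Parameter to the curvature of its space.
    def __init__(self, params, curvatures, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
        defaults = dict(lr=lr, betas=betas, eps=eps)
        super().__init__(params, defaults)
        for group in self.param_groups:
            for p in group['params']:
                # self.state is keyed by the parameter tensor itself,
                # so extra per-parameter data can be stashed up front
                self.state[p]['curvature'] = curvatures[p]

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                state = self.state[p]
                # retrieved exactly as in the snippet above
                curvature = state['curvature']
                # ... Adam-style update using curvature would go here ...
        return loss

An alternative I considered is abusing per-group options, since a param-group dict keeps any extra keys you put in it; with one parameter per group, group['curvature'] becomes available inside step():

param_groups = [{'params': [p], 'curvature': c}
                for p, c in zip(model.parameters(), my_curvatures)]

(model and my_curvatures are placeholders here.) Neither of these feels clean, which is why I am asking whether there is a better way.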