Forcing an object to be regarded as a Parameter

Assume I have created an object as described in
Extending Autograd - Extending torch with a Tensor wrapper type

I do not need any checks; I just need the object to be passed to a custom autograd.Function that computes the backward pass (again, without any type checks or similar)
and to be recognized by a custom optimizer as a parameter. Wrapping it in nn.Parameter does not work, and self.register_parameter also requires an nn.Parameter instance.

Is there any way to brute-force register the tensor as a parameter, so that it appears in model.parameters() and receives updates from the optimizer?

You don’t need the tensor to be an nn.Parameter. You can pass any tensor with requires_grad=True directly to the optimizer and it will work.
