How can I update the grad of a parameter that is not added to the computation graph?

I want to define a custom layer that has a weight parameter. The layer contains several Conv2d operations, and each Conv2d's weight is calculated from the weight I defined. The custom layer weight's grad should therefore be derived from the Conv2d weights' grads, but I'm having trouble updating the custom layer's weight. Since the custom layer's weight is not included in the computation graph, when I try to compute its grad, the type of `weight.grad` is `<type 'NoneType'>`. How can I compute the grad of a parameter that is not in the computation graph, and have the optimizer update it?
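A minimal sketch of the symptom (this is not my exact code; I'm assuming here that the Conv2d weight is copied from the custom parameter W via `.data`, which detaches the copy from the graph):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, bias=False)
W = nn.Parameter(torch.randn_like(conv.weight))
conv.weight.data.copy_(W.data)  # copy_ on .data breaks the autograd link to W

x = torch.randn(1, 3, 32, 32)
conv(x).sum().backward()

print(conv.weight.grad is None)  # False: the Conv2d weight received a grad
print(W.grad is None)            # True:  W never entered the graph
```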

I can’t understand what you want to achieve. Could you share your code with us?

I want to decompose the layer's weight into several weights and use several conv2d operations to represent the conv2d layer, like this:

W = \alpha_{1} B_{1} + \dots + \alpha_{m} B_{m}
output = \sum_{i=1}^{m} \alpha_{i} \, Conv(B_{i}, input)
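A minimal sketch of this formula (`DecomposedConv2d` and all the shapes are placeholders I made up):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecomposedConv2d(nn.Module):
    # output = sum_i alpha_i * Conv(B_i, input),
    # which equals Conv(W, input) with W = sum_i alpha_i * B_i.
    def __init__(self, in_ch, out_ch, k, m):
        super().__init__()
        self.alpha = nn.Parameter(torch.randn(m))                       # alpha_1..alpha_m
        self.bases = nn.Parameter(torch.randn(m, out_ch, in_ch, k, k))  # B_1..B_m

    def forward(self, x):
        # one conv2d per basis, scaled by its coefficient
        return sum(a * F.conv2d(x, b) for a, b in zip(self.alpha, self.bases))

layer = DecomposedConv2d(3, 8, k=3, m=4)
layer(torch.randn(1, 3, 32, 32)).sum().backward()
print(layer.alpha.grad.shape)  # alpha and the bases do get grads this way
```

Written this way, with the alpha_i and B_i as leaf parameters, autograd gives them grads directly; my problem is that in my setup they are derived from W, so W itself gets no grad.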

Since W doesn't participate in the computation directly, I don't know how to update it.