Not backpropagating through neighbor examples in graph network backbone

I have a graph neural network whose architecture is roughly:
Backbone → Graph network.
To compute embeddings for the neighbor nodes, I have to pass them through the backbone, but I don't want to update the backbone with gradients from the neighbors, only from the target nodes. Is there a way to achieve this?

I think the simplest approach would be to run a forward pass of the backbone on the neighbor nodes and then just pass the resulting embeddings in, but that seems a bit clunky.
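Roughly, a minimal sketch of what I mean (`backbone` and `graph_net` are just placeholder modules, not my real ones): the neighbor features go through the backbone under `torch.no_grad()`, so only the target branch contributes backbone gradients.

```python
import torch

# Placeholder modules standing in for the real backbone / graph network.
backbone = torch.nn.Linear(32, 64)
graph_net = torch.nn.Linear(64 * 2, 1)

target_feats = torch.randn(8, 32)       # target node features
neighbor_feats = torch.randn(8, 5, 32)  # 5 neighbors per target

# Targets: regular forward pass, gradients flow into the backbone.
target_emb = backbone(target_feats)

# Neighbors: forward pass without building an autograd graph, so the
# backbone gets no gradient contribution from the neighbor nodes.
with torch.no_grad():
    neighbor_emb = backbone(neighbor_feats)

# Toy aggregation just for illustration: mean over neighbors, then concat.
agg = neighbor_emb.mean(dim=1)
out = graph_net(torch.cat([target_emb, agg], dim=-1))
out.sum().backward()  # backbone grads come only from the target branch
```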

If you only want to backpropagate into a particular set of parameters, you can specify those parameters with the inputs= argument of .backward().
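For example (reusing the placeholder `backbone`/`graph_net` names from the sketch above), `inputs=` restricts which leaf tensors have gradients accumulated by that call:

```python
# Both branches go through the backbone normally here, but this backward
# call only accumulates gradients into graph_net's parameters; the backbone
# parameters receive no gradient from it.
target_emb = backbone(target_feats)
neighbor_emb = backbone(neighbor_feats)

agg = neighbor_emb.mean(dim=1)
out = graph_net(torch.cat([target_emb, agg], dim=-1))

loss = out.sum()
loss.backward(inputs=list(graph_net.parameters()))
```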


Yeah, that works, but then I have to do more work in the backward step (like backpropagating through groups of parameters individually), which can get annoying and is hard to do with Lightning.