Autograd when the NN weights are sampled via an NN sampler

Hi,

I have two NNs. The first NN generates random weights which are used in a second NN to compute the loss. How can I get the derivative w.r.t. the parameters of the first NN?

I tried using torch.nn.utils.vector_to_parameters, which lets me compute the loss, but the gradient of interest comes back as None.
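
For reference, here is a minimal sketch of what I tried (the shapes and the hyper_net / target_net names are made up for the example, but the pattern is the same):

```python
import torch
import torch.nn as nn
from torch.nn.utils import vector_to_parameters

hyper_net = nn.Linear(8, 6)    # first NN: emits a flat vector of 6 weights
target_net = nn.Linear(5, 1)   # second NN: has 5 + 1 = 6 parameters

flat_weights = hyper_net(torch.randn(8))

# copies the generated values into target_net's parameters, but the copy
# goes through .data, so the autograd graph back to hyper_net is cut
vector_to_parameters(flat_weights, target_net.parameters())

loss = target_net(torch.randn(5)).pow(2).sum()
loss.backward()
print(hyper_net.weight.grad)   # None -- no path back to the first NN
```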

Hi Toobayes!

Pass your second NN, the weights generated by your first NN, and the
input you wish to pass to your second NN into torch.func.functional_call().

This will return the output of your second NN, but with it using the weights
generated by your first NN. You may then compute your loss from this
output and backpropagate through functional_call() to get the gradient
of your loss with respect to the parameters of your first NN.
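
As a concrete illustration, here is a minimal, self-contained sketch (the
shapes, the latent input, and the hyper_net / target_net names are made up
for the example). The key point is that the generated weights are never
copied into your second NN's own parameters, so autograd can trace through
them back to your first NN:

```python
import torch
import torch.nn as nn
from torch.func import functional_call

target_net = nn.Linear(4, 1)   # second NN, whose weights get replaced

# number of scalar weights the first NN must generate
n_params = sum(p.numel() for p in target_net.parameters())

# first NN: maps a latent code to a flat vector of generated weights
hyper_net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, n_params))

flat_weights = hyper_net(torch.randn(8))   # stays on the autograd graph

# slice the flat vector into a dict keyed by target_net's parameter names
params = {}
offset = 0
for name, p in target_net.named_parameters():
    params[name] = flat_weights[offset : offset + p.numel()].view_as(p)
    offset += p.numel()

x = torch.randn(16, 4)
out = functional_call(target_net, params, (x,))   # run with generated weights
loss = out.pow(2).sum()
loss.backward()

print(hyper_net[0].weight.grad is not None)   # True -- gradient flows back
```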

Best.

K. Frank