Use the same net for multiple inputs?

I am trying to implement an error function that depends on multiple outputs of the same neural net. Say I have a neural net A, and the error function has the form:

E = A(x1) + A(x2) + A(x3) (this is just an example; it does not need to make much sense)

I wonder if it is OK to implement it in the following way:

y1 = A(x1)
y2 = A(x2)
y3 = A(x3)
E = y1+y2+y3

Would it behave properly when I call E.backward()? Or more specifically, does the information about x1 and x2 get lost when I call y3 = A(x3), so that the backprop result would be incorrect?


Yes, that’s OK. Each forward call adds its own nodes to the autograd graph, so the intermediate results for x1 and x2 are not overwritten when you compute y3 = A(x3). When you call E.backward(), the gradients from all three calls are accumulated into A’s parameters.
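Here is a minimal sketch you can run to check this yourself (assuming PyTorch; nn.Linear stands in for your net A). It verifies that backpropagating through the sum gives the same parameter gradient as summing the gradients of the three calls computed separately:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
A = nn.Linear(2, 1)  # stand-in for the net A

# Three inputs of shape (1, 2)
x1, x2, x3 = torch.randn(3, 1, 2).unbind(0)

# Sum the three outputs and backpropagate once.
E = (A(x1) + A(x2) + A(x3)).sum()
E.backward()
grad_joint = A.weight.grad.clone()

# Compare against per-input gradients accumulated one call at a time.
A.zero_grad()
for x in (x1, x2, x3):
    A(x).sum().backward()

assert torch.allclose(grad_joint, A.weight.grad)  # gradients match
```

If the graph for x1 and x2 were lost on the third call, the two gradients would differ; the assertion passing shows they do not.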
