Are there any cases where ReLU(inplace=True) might cause a silent problem?

Are there any cases where ReLU(inplace=True) will do something unexpected without warning about it? From this question, it was suggested that if it doesn’t cause an error, we shouldn’t expect any problems. However, I’m wondering about slightly more complex circumstances, specifically when using double backward (in a WGAN). In these kinds of cases, is it possible that using ReLU(inplace=True) might end up silently using the wrong values at some point? I suppose I’m also just a bit confused about how the backward is calculated after the inplace operation has taken place.
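For concreteness, the double-backward path I have in mind is a WGAN-GP style gradient penalty. Here is a minimal sketch with a made-up tiny critic (the layer sizes are arbitrary, not from any real model):

```python
import torch
import torch.nn as nn

# Hypothetical tiny critic using an inplace ReLU; sizes are illustrative only.
critic = nn.Sequential(nn.Linear(4, 8), nn.ReLU(inplace=True), nn.Linear(8, 1))

x = torch.randn(2, 4, requires_grad=True)
score = critic(x).sum()

# First backward, keeping the graph so we can differentiate a second time
grad_x, = torch.autograd.grad(score, x, create_graph=True)

# Gradient penalty, then a second backward pass through grad_x
penalty = (grad_x.norm(2, dim=1) - 1).pow(2).mean()
penalty.backward()

# Gradients reach the critic's weights despite the inplace ReLU
print(critic[0].weight.grad.shape)
```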

If there isn’t any error, it should be fine in terms of autograd correctness (unless you are fiddling with .data).
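The reason an error can be relied upon: autograd stamps every tensor with a version counter, and a backward function that needs a saved tensor checks that counter before using it. A minimal demo (not WGAN-specific): `pow` saves its input for backward, so overwriting that input in place makes autograd raise rather than silently use wrong numbers.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x * 2
out = y ** 2      # backward of ** needs y, so autograd saves y
y.add_(1)         # in-place write bumps y's version counter

caught = False
try:
    out.sum().backward()
except RuntimeError:
    # "one of the variables needed for gradient computation has been
    # modified by an inplace operation"
    caught = True
print("autograd raised:", caught)
```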

Thanks. Do you happen to have a simple explanation of how the calculations can be done inplace while the backward can still be computed? That is, if values are being overwritten, where do the values used for the backward come from? Or does the backward just infer the original values from the node? (For ReLU this would make sense, as it’s just deciding whether or not any gradient should be passed at all.)

Yes, for things like ReLU and dropout the gradient can be determined just by looking at the locations of the non-zero values in the output. This can’t be done in general, though: if a backward function needs the original input to compute the gradient, then inplace operations cannot be used on that input.
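A small sketch of that point for ReLU: the mask of non-zero outputs is everything the backward needs, which is why the input can safely be overwritten in place.

```python
import torch
import torch.nn.functional as F

x = torch.randn(6, requires_grad=True)
y = F.relu(x)
y.backward(torch.ones_like(y))

# ReLU's gradient is 1 wherever the *output* is non-zero, 0 elsewhere;
# the pre-activation values themselves are never needed.
mask = (y > 0).float()
print(torch.equal(x.grad, mask))
```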