Out-of-place ReLU operation in PyTorch Neural style transfer tutorial

What’s the point of changing the nn.ReLU layers from in-place to out-of-place in the PyTorch neural style transfer tutorial? I don’t think it would make any difference even if the ReLU were in-place.

It’s mentioned in the tutorial that:

# The in-place version doesn't play very nicely with the ContentLoss and 
# StyleLoss we insert below. So we replace with out-of-place 
# ones here.
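
For context, that comment sits inside the loop that rebuilds the VGG feature extractor layer by layer. Paraphrased from memory (names like model and i mirror the tutorial, details may differ slightly), it looks roughly like this:

import torch.nn as nn
from torchvision import models

# rough sketch of the tutorial's layer-replacement loop (paraphrased)
cnn = models.vgg19(pretrained=True).features.eval()
model = nn.Sequential()
i = 0
for layer in cnn.children():
    if isinstance(layer, nn.Conv2d):
        i += 1
        name = 'conv_{}'.format(i)
    elif isinstance(layer, nn.ReLU):
        name = 'relu_{}'.format(i)
        layer = nn.ReLU(inplace=False)  # the replacement the question is about
    elif isinstance(layer, nn.MaxPool2d):
        name = 'pool_{}'.format(i)
    else:
        name = 'other_{}'.format(i)
    model.add_module(name, layer)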

Why exactly is that the case?