Why does this code cause an in-place operation?

The code is listed below.

fake_w = torch.max(grid[:, :, 0])
fake_h = torch.max(grid[:, :, 1])

I used detect_anomaly to trace the forward error, and it pointed to the fake_h line as the one involved in an in-place operation.
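
(For reference, I enabled anomaly detection roughly like this; the forward pass below is just a placeholder for my real model.)

    import torch

    torch.autograd.set_detect_anomaly(True)   # record forward stack traces for backward errors

    # placeholder forward/backward -- in my real code, grid comes out of the network
    grid = torch.rand(1, 3778, 2, requires_grad=True)
    loss = torch.max(grid[:, :, 0]) + torch.max(grid[:, :, 1])
    loss.backward()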

As far as I can tell, those two lines do the same thing. Why does only the second line cause the error while the first line passes? And my second question: why is the second line an in-place operation at all?

Hi ZiQing!

This code does not perform an in-place operation (unless fake_w or fake_h refers to a view into a pytorch tensor or is some similar alias).
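
For example, here is a minimal sketch (shapes made up) with just those two lines: torch.max only reads each slice and returns a new 0-dim tensor, grid itself is never written to, and backward() runs without complaint.

    import torch

    grid = torch.rand(2, 5, 2, requires_grad=True)
    before = grid.detach().clone()

    fake_w = torch.max(grid[:, :, 0])   # reads the slice, returns a new 0-dim tensor
    fake_h = torch.max(grid[:, :, 1])

    print(torch.equal(grid.detach(), before))   # True -- grid was not modified
    (fake_w + fake_h).backward()                # no in-place error here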

Could you post a complete, runnable script that illustrates your issue?

Best.

K. Frank

@KFrank
Thanks for your quick reply! And here’s the full code of the function.

        min_grid_ = torch.min(grid)
        fake_w = torch.max(grid[:, :, 0])
        fake_h = torch.max(grid[:, :, 1])
        grid[:, :, 0] = 2.0 * grid[:, :, 0].clone() / max(fake_w - 1, 1) - 1.0
        grid[:, :, 1] = 2.0 * grid[:, :, 1].clone() / max(fake_h - 1, 1) - 1.0
        max_grid = torch.max(grid)
        min_grid = torch.min(grid)

The shape of grid is [B, 3778, 2]. The error is listed below.

[W python_anomaly_mode.cpp:104] Warning: Error detected in MaxBackward1. Traceback of forward call that caused the error:
....................................
....................................
File "/home/xywang/xywang/hj-hassony2/meshreg/models/meshregnet.py", line 587, in gridsamplefun
    fake_h = torch.max(grid[:, :, 1])

And I still have one more question: in some code, why do people call .clone() when reading and writing the same elements (e.g. grid[:, :, 0] = 2.0 * grid[:, :, 0].clone())?
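
For context, here is a minimal sketch (made-up names and shapes) of the kind of situation where that .clone() seems to matter: without it, autograd saves the slice for the backward of the squaring op, the in-place assignment then overwrites that slice, and backward() fails; with the .clone(), the right-hand side reads from a copy and backward() runs.

    import torch

    w = torch.rand(4, 2, requires_grad=True)
    grid = w * 3.0   # a non-leaf tensor, standing in for an intermediate result

    # Without .clone(), pow() saves its input -- a view of grid -- for backward,
    # and the in-place slice assignment then overwrites that data, so backward()
    # raises "one of the variables needed for gradient computation has been
    # modified by an inplace operation":
    #
    #     grid[:, 0] = grid[:, 0] ** 2
    #
    # With .clone(), the right-hand side is computed from a copy, the saved
    # tensor is untouched by the in-place write, and backward() succeeds:
    grid[:, 0] = grid[:, 0].clone() ** 2
    grid.sum().backward()
    print(w.grad.shape)   # torch.Size([4, 2]) -- gradients still flow through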