RuntimeError when trying to backprop through slicing operations

Hello everyone,

I’m trying to backpropagate through a dynamic programming algorithm. The algorithm itself is differentiable, but the forward pass operates on tensor slices, which doesn’t seem to be a standard scenario for autograd, and the resulting RuntimeErrors don’t inspire much confidence. I have isolated a small example that highlights the issue.

import torch

# Allocate a random 3 x 3 x 2 energy volume with values in [0, 1)
height, width = 3, 3
ndisp = width - 1
torch.manual_seed(0)
energy_init = torch.rand(height, width, ndisp, requires_grad=True)

# Clone to avoid "RuntimeError: leaf variable has been moved into the graph interior"
energy = energy_init.clone()
torch.autograd.set_detect_anomaly(True)

#######################
# Copy operation here
#######################

# Backprop and print gradients
energy_sum = energy.sum()
energy_sum.backward()
print(energy_init.grad)

With each of the following four copy operations in place, the code runs without errors and computes the correct gradients:

energy[:, 0] = energy[:, 1] 
energy[:, 0] = energy[:, 0] + energy[:, 1]
energy[:, 0] = torch.minimum(energy[:, 1].clone(), torch.tensor([2]))
two = torch.ones_like(energy[:, 1]) * 2
mask = (energy[:, 1] < two)
energy[:, 0] = mask * energy[:, 1] + (~mask) * two

This one, however, generates a RuntimeError:

energy[:, 0] = torch.minimum(energy[:, 1], torch.tensor([2]))

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [3, 2]], which is output 0 of SelectBackward, is at version 1; expected version 0 instead.

It seems that only torch.minimum causes issues, while other binary operations like addition are fine. I’m not sure why autograd considers this computation to be “in-place”.
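
For what it’s worth, here is how I have been peeking at the version counters the error message refers to. This is only a diagnostic sketch and relies on the internal _version attribute, so take it with a grain of salt:

import torch

x = torch.rand(3, 3, 2, requires_grad=True).clone()  # cloned leaf, as above
view = x[:, 1]            # slicing returns a view that shares x's version counter
print(view._version)      # 0: no in-place writes yet
x[:, 0] = view + 1.0      # slice assignment is an in-place copy into x
print(view._version)      # 1: the shared counter was bumped by the write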

My main concern is that correctness is somehow compromised. But also, since the pointwise minimum is applied many times in the full algorithm, I would prefer not to clone the slice on every step.
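
In case it helps, the least hacky variant I have found so far is a torch.where version of the mask workaround above. I am assuming (not verified beyond this toy example) that it avoids the error for the same reason the mask version does, since the comparison result is a fresh boolean tensor rather than a view of energy, and it skips the per-step clone:

# Picks energy[:, 1] where it is below 2 and the constant 2 otherwise,
# like the mask workaround above but without multiplying by a boolean mask.
two = torch.full_like(energy[:, 1], 2.0)
energy[:, 0] = torch.where(energy[:, 1] < 2.0, energy[:, 1], two)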

Is the code above correct, and is there a less hacky way to resolve the issue?

Thank you in advance.