TracerWarning: Converting a tensor to a Python index might cause the trace to be incorrect... This means that the trace might not generalize to other inputs!

I am trying to create a trace file. The relevant part of the code is here:

def center_crop(self, layer, target_size):
    _, _, layer_height, layer_width = layer.size()

    diff_y = (layer_height - target_size[0]) // 2
    diff_x = (layer_width - target_size[1]) // 2
    return layer[:, :, diff_y:(diff_y + target_size[0]), diff_x:(diff_x + target_size[1])]

def forward(self, x, bridge):

    up = self.up(x)
    crop1 = self.center_crop(bridge, up.shape[2:])
    out = torch.cat([up, crop1], 1)
    out = self.conv_block(out)

    return out

It gives the warning at this line:

return layer[:, :, diff_y:(diff_y + target_size[0]), diff_x:(diff_x + target_size[1])]

Then I loaded this trace file in C++ and made a prediction for a test image. When I compared the result with the output in Python, the results were very different. I suspect the reason originates from that warning.


The tracer uses the example inputs you provide and records the operations. If a different input would result in a different operation, then this will not be captured by the tracer. Here it’s recording the arguments to the slice operation as constants (so it will always slice it with the values produced with the example inputs).

To get around this, put any code with data-dependent control flow inside a ScriptModule, and then call that in your traced code.


@driazati I added a helper function and wrote this code:

@torch.jit.script
def center_slice_helper(layer, diff_y, diff_x, h_end, w_end):
    return layer[:, :, diff_y:h_end, diff_x:w_end]

def center_crop(self, layer, target_size):
    #_, _, layer_height, layer_width = layer.size()

    diff_y = (layer.shape[2] - target_size.shape[2]) // 2
    diff_x = (layer.shape[3] - target_size.shape[3]) // 2

    h_end = diff_y + target_size.shape[2]
    w_end = diff_x + target_size.shape[3]

    return center_slice_helper(layer, diff_y, diff_x, h_end, w_end)


def forward(self, x, bridge):

    up = self.up(x)
    crop1 = self.center_crop(bridge, up)
    out = torch.cat([up, crop1], 1)
    out = self.conv_block(out)

    return out

This time, it gave this exception:

torch.jit.TracingCheckError: Tracing failed sanity checks!
Encountered an exception while running the Python function with test inputs.
Exception:
	center_slice_helper() expected value of type Tensor for argument 'diff_y' in position 1, but instead got value of type int.
	Value: 0
	Declaration: center_slice_helper(Tensor layer, Tensor diff_y, Tensor diff_x, Tensor h_end, Tensor w_end) -> Tensor
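The error comes from TorchScript's default typing: when a scripted function has no annotations, every argument is assumed to be a Tensor, so the plain Python ints computed in `center_crop` are rejected. A minimal sketch of the same mismatch, using a hypothetical function:

```python
import torch

@torch.jit.script
def scale(x, factor):
    # with no annotation, `factor` is typed as Tensor by default
    return x * factor

ok = scale(torch.ones(2), torch.tensor(2.0))  # Tensor argument works
try:
    scale(torch.ones(2), 2)  # plain int: rejected, like diff_y above
    raised = False
except RuntimeError:
    raised = True
```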

I debugged the code and observed that the x parameter of the forward function has this type:

How to handle this?

@driazati I have read the related part in documentation: https://pytorch.org/docs/stable/jit.html#mixing-tracing-and-scripting

I understand that I should use MyPy-style type annotations. What is your opinion?

Our support for non-tensor types in tracing is pretty limited (see #14455), but it looks like script mode would work well for your model. See something like:

@torch.jit.script
def center_slice_helper(layer, diff_y, diff_x, h_end, w_end):
    # type: (Tensor, int, int, int, int) -> Tensor
    return layer[:, :, diff_y:h_end, diff_x:w_end]

class M(torch.jit.ScriptModule):
    @torch.jit.script_method
    def center_crop(self, layer, target_size):
        #_, _, layer_height, layer_width = layer.size()

        diff_y = (layer.shape[2] - target_size.shape[2]) // 2
        diff_x = (layer.shape[3] - target_size.shape[3]) // 2

        h_end = diff_y + target_size.shape[2]
        w_end = diff_x + target_size.shape[3]

        return center_slice_helper(layer, diff_y, diff_x, h_end, w_end)

    @torch.jit.script_method
    def forward(self, x, bridge):
        crop1 = self.center_crop(bridge, x)
        return crop1

m = M()
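On a newer PyTorch, the same idea can also be written with ordinary Python type annotations and `torch.jit.script` on a plain `nn.Module`. A self-contained sketch with hypothetical shapes:

```python
import torch

@torch.jit.script
def center_slice_helper(layer: torch.Tensor, diff_y: int, diff_x: int,
                        h_end: int, w_end: int) -> torch.Tensor:
    return layer[:, :, diff_y:h_end, diff_x:w_end]

class M(torch.nn.Module):
    def forward(self, x: torch.Tensor, bridge: torch.Tensor) -> torch.Tensor:
        # shape arithmetic stays in Python ints, which script handles natively
        diff_y = (bridge.shape[2] - x.shape[2]) // 2
        diff_x = (bridge.shape[3] - x.shape[3]) // 2
        return center_slice_helper(bridge, diff_y, diff_x,
                                   diff_y + x.shape[2], diff_x + x.shape[3])

m = torch.jit.script(M())
bridge = torch.randn(1, 3, 12, 12)
x = torch.randn(1, 3, 8, 8)
cropped = m(x, bridge)  # centre 8x8 window of bridge
```

Because the module is scripted rather than traced, the crop offsets are recomputed from the actual input shapes on every call, so the exported model generalizes to other sizes.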

Hello, I’ve been having this problem lately. How did you solve it?