Variation of results

I have a quantized version of the resnet18 network. During inference, I fed it a single image from a sequentially sampled data loader.
I tried printing the image tensor and the input to the first convolution layer, and got some mismatches.

To print the input image fed from the data loader, I used:

import numpy as np

img = image.detach().numpy()
# permute from NCHW to HWCN for printing
img = np.transpose(img, (2, 3, 1, 0))
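As a quick sanity check on that permutation (using a hypothetical 1x3x4x4 batch), `(2, 3, 1, 0)` maps an NCHW tensor to HWCN layout:

```python
import numpy as np

# Hypothetical NCHW batch: 1 image, 3 channels, 4x4 spatial.
img = np.arange(1 * 3 * 4 * 4, dtype=np.float32).reshape(1, 3, 4, 4)

# (2, 3, 1, 0) maps axes N,C,H,W -> H,W,C,N.
hwcn = np.transpose(img, (2, 3, 1, 0))
print(hwcn.shape)  # (4, 4, 3, 1)
```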

To get the input to the first conv layer:

layer_input = {}

def get_input(name):
    def hook(model, input, output):
        # `input` is a tuple of the layer's positional inputs
        layer_input[name] = input
    return hook

model.conv1.register_forward_hook(get_input('conv1'))

qdm = torch.nn.quantized.DeQuantize()
# val holds the captured quantized input tensor, i.e. layer_input['conv1'][0]
deqout = qdm(val)
deqout = deqout.numpy()
deqout = np.transpose(deqout, (2, 3, 1, 0))

Image data:

tensor([[[[-0.5082, -0.3883, -0.4226, ..., 0.9303, 0.3823, 0.6392],
          [-0.6281, -0.6965, -0.4397, ..., 0.8104, 0.5878, 0.2111],
          [-0.5767, -0.1486,  0.0741, ..., 0.7419, 0.8961, 0.2282],

Input to conv layer:

-0.52449334, -0.5619572, -0.7492762, -0.3746381, -0.41210192, -0.5619572, -0.41210192, -0.03746381, 0.07492762, 0.0, -0.26224667, -0.59942096, -0.18731906, -0.41210192, -0.7118124, -0.7118124

These should be close to each other. Just to confirm: is your "img" input taken after the Normalize transformation mentioned here: https://pytorch.org/tutorials/advanced/static_quantization_tutorial.html

        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
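For reference, `transforms.Normalize` computes `(x - mean) / std` per channel on a tensor already scaled to [0, 1] by `ToTensor`. A minimal numpy sketch (using a hypothetical constant-valued image):

```python
import numpy as np

mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
std = np.array([0.229, 0.224, 0.225], dtype=np.float32)

# Hypothetical CHW image with all pixels at 0.5 (as ToTensor would produce).
img = np.full((3, 2, 2), 0.5, dtype=np.float32)

# Normalize: subtract the per-channel mean, divide by the per-channel std.
normed = (img - mean[:, None, None]) / std[:, None, None]
print(normed[:, 0, 0])
```

If the printed image tensor was captured before this step, a mismatch with the conv-layer input is expected regardless of quantization.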