# Patch-wise agreement of binary images

Let’s say I have two very sparse binary images (0 is much more frequent than 1) and I want to calculate the per-pixel precision and recall of the two masks agreeing with each other. Basically asking: where are the masks equal, given that a pixel in A is true, and the same given that a pixel in B is true? We could do that along the lines of:

```python
# note: the parentheses are required -- `&` binds tighter than `==` in Python
p = ((a == b) & (a == 1)).float().mean()
r = ((a == b) & (b == 1)).float().mean()
```
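For what it’s worth, if “precision” is meant in the usual sense, it would be normalized by the number of positive pixels rather than by all pixels. A sketch of that reading (the toy masks are my own example):

```python
import torch

# toy sparse masks (my own example inputs)
a = torch.tensor([[1., 0., 0.],
                  [0., 1., 0.],
                  [0., 0., 0.]])
b = torch.tensor([[1., 0., 0.],
                  [0., 0., 0.],
                  [0., 0., 1.]])

tp = ((a == 1) & (b == 1)).float().sum()      # true positives
p = tp / (a == 1).float().sum().clamp(min=1)  # precision: TP over A's positives
r = tp / (b == 1).float().sum().clamp(min=1)  # recall: TP over B's positives
```

The `clamp(min=1)` guards against dividing by zero when a mask is empty.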

Now, I want to do this patch-wise using a window. For example, given that the pixel at i, j in image A is true, is any pixel true in image B within a KxK region around i, j? I also want to do this batch-wise, of course.

Is there any way of doing this vectorized in PyTorch? Currently, I can only come up with solutions using 3 for loops. I feel like this can somehow be done using max-pooling and a custom convolution kernel.

Hi Pietz!

Apply your `a == b` test to a max-pooled version of `b`:

```python
>>> import torch
>>> torch.__version__
'1.10.2'
>>> _ = torch.manual_seed (2022)
>>> b = (torch.randn (1, 1, 10, 10) > 1.7).float()
>>> b
tensor([[[[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 0., 1., 0.],
          [0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
          [0., 1., 0., 1., 0., 0., 0., 0., 0., 0.]]]])
>>> torch.nn.MaxPool2d (3, stride = 1, padding = 1) (b)
tensor([[[[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0., 0., 1., 1., 1.],
          [0., 0., 0., 0., 0., 0., 0., 1., 1., 1.],
          [0., 0., 0., 0., 0., 0., 0., 1., 1., 1.],
          [1., 1., 1., 0., 0., 0., 0., 0., 0., 0.],
          [1., 1., 1., 0., 1., 1., 1., 0., 0., 0.],
          [1., 1., 1., 0., 1., 1., 1., 0., 0., 0.],
          [0., 0., 0., 0., 1., 1., 1., 1., 1., 0.],
          [1., 1., 1., 1., 1., 0., 1., 1., 1., 0.],
          [1., 1., 1., 1., 1., 0., 1., 1., 1., 0.]]]])
```
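Putting the pieces together, here is one way the whole thing could look batch-wise: max-pool each mask so a true pixel "covers" its KxK neighborhood, then count matches. This is a sketch under my own assumptions (the function name, (N, 1, H, W) float inputs, and odd K are all mine):

```python
import torch

def patch_precision_recall(a, b, k=3):
    """Patch-wise precision/recall of two sparse binary masks.

    a, b: float tensors of shape (N, 1, H, W) containing 0s and 1s.
    A true pixel in one mask counts as matched if any pixel in the
    KxK window around it in the other mask is true. K must be odd.
    """
    # dilate each mask: every true pixel spreads to its KxK neighborhood
    pool = torch.nn.MaxPool2d(k, stride=1, padding=k // 2)
    a_dil = pool(a)
    b_dil = pool(b)
    # precision: fraction of A's true pixels with a true pixel of B nearby
    p = (a * b_dil).sum() / a.sum().clamp(min=1)
    # recall: fraction of B's true pixels with a true pixel of A nearby
    r = (b * a_dil).sum() / b.sum().clamp(min=1)
    return p, r
```

No loops, and it runs over the whole batch at once; `clamp(min=1)` just avoids a division by zero for empty masks.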

Best.

K. Frank

How did I not see that… Anyway, thanks a lot for your time!