How to compute the loss in an ROI of a tensor?

I have a tensor of size 16x16. I only want to compute the loss and update the gradient in a region of interest (ROI) of the tensor, e.g. ROI[4:12, 4:12] = 1. I used the code below, but it throws an error. How can I do this? Thanks

import torch

input = torch.rand((16, 16), requires_grad=True)
print(input)
mask = torch.zeros_like(input)
mask[4:12, 4:12] = 1.0
input = mask * input

d = torch.mean(input)
d.backward()
print (input.grad.data)

The error is:

AttributeError                            Traceback (most recent call last)
<ipython-input-35-9826ca6fcef5> in <module>()
     10 d = torch.mean(input)
     11 d.backward()
---> 12 print (input.grad.data)

AttributeError: 'NoneType' object has no attribute 'data'

It looks like input.grad is None.

Hi John,

Here,

input = mask * input

you are replacing (and hiding) the real input, the leaf node of your graph. If you want to access the gradients of the input, keep the original input reference and give the masked tensor its own name:

import torch

input = torch.rand((16, 16), requires_grad=True)
print(input)
mask = torch.zeros_like(input)
mask[4:12, 4:12] = 1.0
masked_input = mask * input  # keep the leaf `input` intact; bind the product to a new name
d = torch.mean(masked_input)
d.backward()
print(input.grad)  # the leaf's .grad is populated after backward()

Hope that helps!

Additionally, if you ever want to access the gradients of masked_input, which is a non-leaf tensor, you will have to call masked_input.retain_grad() right after its creation.
You can find more info about it here.
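
For example, a minimal sketch continuing the snippet above:

import torch

input = torch.rand((16, 16), requires_grad=True)
mask = torch.zeros_like(input)
mask[4:12, 4:12] = 1.0

masked_input = mask * input
masked_input.retain_grad()  # ask autograd to keep grads for this non-leaf tensor

d = torch.mean(masked_input)
d.backward()

# masked_input.grad is now populated; every element of masked_input feeds the
# mean, so each entry is 1/(16*16)
print(masked_input.grad)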

Thanks @spanev. It worked, but the result looks wrong. If the region outside the mask is ignored, the expected mean grad should be 1 instead of 0.11111.

What do you mean by ignoring the region in the mask? Which gradient should be 1? d.grad will be 1, since we run backward on it.

In d = torch.mean(masked_input), all of the elements (even the zeroed ones) contribute to the mean's denominator, so the final gradients will be 1/(16*16) on the non-masked elements and 0 on the masked/zeroed ones.
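
A quick check of that, as a minimal sketch printing one element inside and one outside the ROI:

import torch

input = torch.rand((16, 16), requires_grad=True)
mask = torch.zeros_like(input)
mask[4:12, 4:12] = 1.0

d = torch.mean(mask * input)
d.backward()

print(input.grad[6, 6])   # inside the ROI: 1/256 ~= 0.0039
print(input.grad[0, 0])   # outside the ROI: 0.0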

I mean I want to set the grad outside the ROI to zero, i.e. leave that region untouched.
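
If the goal is a mean over the ROI elements only (so the gradients outside the ROI stay zero and that region is untouched), one way, as a minimal sketch, is to normalize the masked sum by the number of ROI elements instead of by the full tensor size:

import torch

input = torch.rand((16, 16), requires_grad=True)
mask = torch.zeros_like(input)
mask[4:12, 4:12] = 1.0

# mean over the ROI only: divide the masked sum by the ROI element count
d = (mask * input).sum() / mask.sum()
d.backward()

print(input.grad[6, 6])   # inside the ROI: 1/64 = 0.015625
print(input.grad[0, 0])   # outside the ROI: 0.0, so the region is untouched

Slicing directly, d = input[4:12, 4:12].mean(), gives the same gradients.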