I want to measure the horizontal and vertical gradients of an image to use as a gradient loss.

To do so, I need to compute the difference between neighboring pixels for each pixel along the vertical and horizontal directions:

$$\bar{g}(x, y) = \frac{1}{N} \sum_{(x', y') \in \Omega(x, y)} g(x', y')$$

Where:

g(x', y') is the absolute value of the horizontal/vertical gradient at pixel location (x', y'),

Ω(x, y) is the neighborhood of the pixel location (x, y), containing (x + 1, y) and (x, y + 1) only.

Is there an efficient way to do this for a single-channel target image (like a segmentation mask)?
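One vectorized way to compute this is with array slicing, so no per-pixel loop is needed. The sketch below uses NumPy on a 2D array (the same slicing works on PyTorch tensors); the function name `gradient_loss` and the choice to average over all valid differences are my assumptions, not something fixed by the formula above:

```python
import numpy as np

def gradient_loss(img):
    """Mean absolute horizontal and vertical gradient of a 2D image.

    Differences between each pixel and its right and bottom neighbors
    (the neighborhood Omega from the question) are taken via slicing.
    """
    img = img.astype(np.float64)
    # Horizontal gradient: difference with the right neighbor (x + 1, y)
    gh = np.abs(img[:, 1:] - img[:, :-1])
    # Vertical gradient: difference with the bottom neighbor (x, y + 1)
    gv = np.abs(img[1:, :] - img[:-1, :])
    # Average all absolute differences; N is the number of valid pairs
    return (gh.sum() + gv.sum()) / (gh.size + gv.size)
```

For example, `gradient_loss(np.array([[0., 1.], [2., 3.]]))` averages the four differences 1, 1, 2, 2, giving 1.5, and a constant image gives 0. Note that border pixels lack a right or bottom neighbor, so the slices are one element shorter than the image along the differenced axis.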