I have a use-case where I need to perform other operations over a sliding window.
For example, a regular convolution computes the dot product between the kernel weights and each sliding window, but I want to compute the Euclidean distance (or sum, or …) between the kernel weights and the sliding window instead. Is there a nice way to do this?
That is, using the sliding-window functionality of the convolution layers, but replacing the dot product with something else.

I think I found an answer.
It seems the right way to do it is with the unfold and fold functions (`torch.nn.functional.unfold` / `fold`).
With unfold one can extract exactly the sliding-window slices that a convolution operation slides over.
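To make that concrete, here is a small sketch (my own toy example, not from any official source) showing that `unfold` produces the same windows a convolution uses: a convolution is just a matrix multiply of the flattened kernel with the unfolded columns.

```python
import torch
import torch.nn.functional as F

# Toy input: batch of 1, 1 channel, 4x4 image; a single 2x2 kernel.
x = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)
w = torch.ones(1, 1, 2, 2)

# unfold extracts every 2x2 sliding window as a column:
# shape (N, C * kh * kw, L), where L is the number of window positions.
patches = F.unfold(x, kernel_size=2)          # (1, 4, 9)

# Convolution == matmul of the flattened kernel with those columns,
# reshaped back to the output spatial layout (3x3 here).
out_unfold = (w.reshape(1, -1) @ patches).reshape(1, 1, 3, 3)
out_conv = F.conv2d(x, w)

print(torch.allclose(out_unfold, out_conv))
```

The point is that once the input is in this `(N, C*kh*kw, L)` layout, the dot product is just one of many reductions you could apply per column.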
But one thing that remains unclear is how I can apply my own function (e.g. Euclidean distance) efficiently, i.e. without a Python for-loop, over those slices.
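One way I can see to avoid the loop (a sketch of my own, using broadcasting rather than any dedicated API) is to broadcast the flattened kernels against the unfolded columns and reduce over the patch dimension:

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 3, 8, 8)       # batch of 2, 3 channels
w = torch.randn(5, 3, 3, 3)       # 5 "kernels" of size 3x3

kh, kw = w.shape[-2:]
patches = F.unfold(x, kernel_size=(kh, kw))   # (N, C*kh*kw, L) = (2, 27, 36)

# Broadcast kernels against patches:
#   patches: (N, 1, C*kh*kw, L)
#   kernels: (1, K, C*kh*kw, 1)
diff = patches.unsqueeze(1) - w.reshape(1, w.shape[0], -1, 1)

# Euclidean distance per window: reduce over the patch dimension.
dist = diff.pow(2).sum(dim=2).sqrt()          # (N, K, L)

# Reshape L back to the output spatial size, just as a conv would.
oh, ow = x.shape[-2] - kh + 1, x.shape[-1] - kw + 1
out = dist.reshape(x.shape[0], w.shape[0], oh, ow)   # (2, 5, 6, 6)
```

Swapping `.pow(2).sum(dim=2).sqrt()` for another reduction (e.g. `.sum(dim=2)`) gives other window-wise operations. Note the broadcasted `diff` tensor has shape `(N, K, C*kh*kw, L)`, which can get large; for Euclidean distance specifically, expanding `||a - b||^2 = ||a||^2 + ||b||^2 - 2 a·b` would let the cross term reuse the ordinary matmul path with less memory.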