I need to compute the approximate derivative of a tensor x
using finite differences, i.e. x[1:] - x[:-1]. My problem is that my input tensor is multidimensional and I need to compute those finite differences along different dimensions. I would thus need a generic function f(x, dim) that basically does the following:
if dim == 0:
    return x[1:] - x[:-1]
elif dim == 1:
    return x[:, 1:] - x[:, :-1]
elif dim == 2:
    return x[:, :, 1:] - x[:, :, :-1]
elif dim == -1:
    return x[..., 1:] - x[..., :-1]
# etc.
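One generic way to write this f (a sketch; the name finite_diff is mine, and I have not verified that it matches the speed of the hard-coded slices) is to build the index tuple at runtime with Python slice objects:

```python
import torch

def finite_diff(x, dim):
    # Build index tuples like (slice(None), ..., slice(1, None), ...)
    # so that the dimension being sliced is chosen at runtime.
    front = [slice(None)] * x.dim()
    back = [slice(None)] * x.dim()
    front[dim] = slice(1, None)   # selects x[..., 1:, ...] along `dim`
    back[dim] = slice(None, -1)   # selects x[..., :-1, ...] along `dim`
    return x[tuple(front)] - x[tuple(back)]
```

Negative dims also work here, since front[-1] indexes the last entry of the list.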
The only trick I found was to use index_select, but building the index tensor and gathering over it is much slower than plain slicing, as you can see below:
$ python3 -m timeit -s 'import torch; a = torch.randn(5, 2048, 5)' 'a[:,1:,:] - a[:,:-1,:]'
10000 loops, best of 3: 75.4 usec per loop
$ python3 -m timeit -s 'import torch; a = torch.randn(5, 2048, 5)' 'indices = torch.arange(a.size(1)); torch.index_select(a, 1, indices[1:]) - torch.index_select(a, 1, indices[:-1])'
10 loops, best of 3: 26.8 msec per loop
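For comparison, torch.narrow also selects along a runtime-chosen dimension without materializing an index tensor (again a sketch, with performance untested here; the helper name is mine):

```python
import torch

def finite_diff_narrow(x, dim):
    n = x.size(dim)
    # narrow(dim, start, length) returns a view of a contiguous
    # slice of length `length` starting at `start` along `dim`.
    return x.narrow(dim, 1, n - 1) - x.narrow(dim, 0, n - 1)
```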
Is there a clean and fast way to solve this problem?