‘gelu_backward_cuda’ is not a member of ‘at::native’

Hi, it seems PyTorch 1.10 has changed its C++ API for the gelu and softmax backward functions. Previously, I could implement customized backward functions by calling:

at::native::gelu_backward_cuda(grad_output, input);
at::native::softmax_backward_cuda(grad_output, output, dim, self);

But now I get these errors:

error: ‘gelu_backward_cuda’ is not a member of ‘at::native’
error: ‘softmax_backward_cuda’ is not a member of ‘at::native’

Is there a solution for this?

Kind regards

You should be able to replace them with gelu_backward_out_cuda and softmax_backward_cuda_out.

Thanks for your quick response!

I have tried gelu_backward_out_cuda and softmax_backward_cuda_out; however, the same kind of errors still appear. I was wondering if the arguments for the two functions have changed as well?

My code:

return at::native::gelu_backward_cuda_out(grad_output, input);
...
return at::native::softmax_backward_cuda_out(grad_output, output, dim, self);

The errors:

error: ‘gelu_backward_cuda_out’ is not a member of ‘at::native’
error: ‘softmax_backward_cuda_out’ is not a member of ‘at::native’
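For anyone hitting this later: rather than chasing the renamed at::native internals (whose names and signatures are not a stable API and change between releases), one option is to go through the public at:: dispatcher ops, which route to the CUDA kernels automatically based on the tensors' device. The sketch below is an assumption based on the PyTorch ~1.10 operator schemas for at::gelu_backward and at::_softmax_backward_data; the wrapper names my_gelu_backward / my_softmax_backward are hypothetical, and you should verify the exact signatures against the headers of your installed version (e.g. later releases added an `approximate` argument to gelu_backward).

```cpp
#include <ATen/ATen.h>

// Hypothetical replacement wrappers: instead of the removed
// at::native::gelu_backward_cuda / softmax_backward_cuda symbols,
// call the public dispatcher ops. For CUDA tensors these dispatch
// to the CUDA kernels; no at::native include is needed.
// Signatures assumed from PyTorch ~1.10 -- check your headers.

at::Tensor my_gelu_backward(const at::Tensor& grad_output,
                            const at::Tensor& input) {
  return at::gelu_backward(grad_output, input);
}

at::Tensor my_softmax_backward(const at::Tensor& grad_output,
                               const at::Tensor& output,
                               int64_t dim,
                               const at::Tensor& self) {
  return at::_softmax_backward_data(grad_output, output, dim, self);
}
```

Going through the dispatcher also keeps the same code path working on CPU tensors, which the `*_cuda`-suffixed internals never did.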