Understanding code organization: where is `log_softmax` really implemented?

Hi –

So, I’m new to PyTorch, and I’m spending a lot of time in the docs. Recently, I was digging around trying to find out how `log_softmax` is implemented.

I started out looking at the source for `torch.nn.LogSoftmax`, which is implemented in terms of `torch.nn.functional.log_softmax`. OK, so I went to the docs for that and clicked the source link, and found that this function is implemented by calling the `.log_softmax` method of its input.
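For what it’s worth, the delegation chain traced above can be sketched in plain Python (this is an illustrative stand-in, not PyTorch’s actual code; the `Fake*` names are made up for the example):

```python
class FakeTensor:
    """Stand-in for torch.Tensor; the real method lives in compiled code."""
    def __init__(self, value):
        self.value = value

    def log_softmax(self):
        # In real PyTorch this call dispatches to a native kernel.
        return f"log_softmax({self.value})"


def functional_log_softmax(t):
    """Stand-in for torch.nn.functional.log_softmax: delegates to the method."""
    return t.log_softmax()


class FakeLogSoftmaxModule:
    """Stand-in for torch.nn.LogSoftmax: delegates to the functional form."""
    def __call__(self, t):
        return functional_log_softmax(t)


# Module -> functional -> method, each layer just forwarding the call.
result = FakeLogSoftmaxModule()(FakeTensor("x"))
assert result == "log_softmax(x)"
```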

I figured that this meant that `.log_softmax` would probably be a method of `torch.Tensor`, so I went to the docs for `Tensor`, but was unable to find any such method. I’m wondering, where is this method implemented?

I’d also appreciate any general insight into how PyTorch’s code organization determines where this code, and other code like it, ends up living.

Thank you!


You’ll find the CPU implementation here. Methods like `Tensor.log_softmax` are implemented in C++ (in ATen) and exposed to Python through generated bindings, which is why you won’t find a Python definition of them in the `Tensor` docs or source.
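If it helps to see the math the kernel is computing, here is a minimal plain-Python sketch of a numerically stable log-softmax (this is just the standard log-sum-exp formulation, not a transcription of the actual C++ kernel):

```python
import math

def log_softmax(xs):
    """Numerically stable log-softmax over a list of floats.

    Computes log_softmax(x)_i = (x_i - m) - log(sum_j exp(x_j - m)),
    where m = max(x). Subtracting the max before exponentiating
    avoids overflow for large inputs.
    """
    m = max(xs)
    log_sum = math.log(sum(math.exp(x - m) for x in xs))
    return [(x - m) - log_sum for x in xs]

vals = log_softmax([1.0, 2.0, 3.0])
# Exponentiating the results should give a proper softmax that sums to 1.
assert abs(sum(math.exp(v) for v in vals) - 1.0) < 1e-12
```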

PS: `rgrep` is often a useful tool for finding the corresponding function.
The GitHub search function is often also quite helpful.
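For example, from the root of a checkout you can recursively grep for the symbol (the demo below builds a tiny stand-in tree so it runs anywhere; the file name and path are illustrative, not PyTorch’s real layout):

```shell
# In a real PyTorch checkout you would run something like:
#   grep -rn "log_softmax" aten/src/ATen/native/
# Self-contained demonstration on a stand-in source tree:
mkdir -p demo/native
printf 'Tensor log_softmax_cpu(const Tensor& input);\n' > demo/native/SoftMax.cpp
grep -rn "log_softmax" demo/
# grep prints file:line:match, pointing you straight at the definition.
```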
