I have not seen named tensors in any project I have looked at, yet they seem almost as good a practice as writing tests. Why haven't they caught on? Are there any cons to using them besides the time it takes to write them?
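For reference, a minimal sketch of what named tensors look like in PyTorch (available since 1.3 as a prototype feature; the variable names here are just illustrative):

```python
import torch

# Named tensors attach a name to each dimension at construction time.
imgs = torch.randn(32, 3, 64, 64, names=('N', 'C', 'H', 'W'))
print(imgs.names)   # ('N', 'C', 'H', 'W')

# Dimensions can then be referenced by name instead of by position:
means = imgs.mean(['H', 'W'])
print(means.names)  # ('N', 'C')

# align_to() permutes by name, replacing error-prone permute(0, 2, 3, 1):
nhwc = imgs.align_to('N', 'H', 'W', 'C')
print(nhwc.shape)   # torch.Size([32, 64, 64, 3])
```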
- The implementation is incomplete, and I don't think anyone is working on it.
- It is oddly non-polymorphic: many operations are simply blocked when names are present, not only ops with dim=int arguments but also things like unsqueeze(). If these could re-synchronize names instead, it would be much easier to integrate named-tensor code into existing codebases.
- Related to the above, they're not compatible with many standard routines (most notably, much of the nn namespace). For example, while F.linear/mm work, convolutions and RNNs fail.
- Most ML code doesn't require dimension permutations; I guess people working with BCHW tensors don't care that much about dimension names.