How can multiple tensors conveniently be stored in one object?

For a project of mine I need to bundle some torch.Tensors into one object, since they are closely related. Think of something like this:

import torch

a = torch.zeros(5)
b = torch.ones(4)
c = torch.rand(3)

class MyObject:
    def __init__(self, a, b, c):
        self.a = a
        self.b = b
        self.c = c

my_object = MyObject(a, b, c)

MyObject is only used for convenient storage (think namedtuple). I have two requirements for MyObject:

  1. Regarding the utility methods, MyObject should act like a single torch.Tensor. For example my_object.cuda() should move all stored torch.Tensors to the GPU.
  2. If embedded into a torch.nn.Module it should be treated like another torch.Tensor. Thus, if I call state_dict() it should also extract the torch.Tensors stored in MyObject.

The only way I see to achieve this is by making MyObject an nn.Module:

import torch
from torch import nn

class TensorStorage(nn.Module):
    def __init__(self, **attrs):
        super().__init__()
        for name, attr in attrs.items():
            if isinstance(attr, torch.Tensor):
                self.register_buffer(name, attr)
            else:
                setattr(self, name, attr)

    def forward(self):
        msg = (f"{self.__class__.__name__} objects are only used "
                "for storage and cannot be called.")
        raise RuntimeError(msg)

my_object = TensorStorage(a=a, b=b, c=c)
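
For illustration, here is a quick check of both requirements. The Model class below is just a hypothetical host module for the check, and the CUDA part is guarded so the snippet also runs on CPU-only machines:

class Model(nn.Module):
    def __init__(self, storage):
        super().__init__()
        self.linear = nn.Linear(5, 1)
        self.storage = storage  # assigning an nn.Module attribute registers it as a submodule

model = Model(my_object)

# Requirement 2: the stored tensors appear in the parent module's state_dict.
print(list(model.state_dict().keys()))
# ['linear.weight', 'linear.bias', 'storage.a', 'storage.b', 'storage.c']

# Requirement 1: utility methods act on all stored tensors at once.
if torch.cuda.is_available():
    my_object.cuda()
    print(my_object.a.device)  # cuda:0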

As far as I’ve tested, this works for my purpose. However, being an nn.Module implies that it can be called like any other module. Although this can be caught with an error in the forward() method, it might be misleading for the user. My questions are:

  1. Can someone think of a better approach to address my requirements?
  2. If not, can I use TensorStorage as implemented, or will I get into trouble somewhere down the road?

Hello Philip,

If you’re interested in ongoing development, we’re currently working on https://github.com/pytorch/nestedtensor as a first-class representation of collections of variably sized Tensors. I’m happy to talk further about this if you want and would love your feedback.

Thanks,
Christian
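
For reference, the same bundling idea is now available through the torch.nested prototype API in mainline PyTorch, which grew out of this work; below is a minimal sketch (the standalone nestedtensor repo’s own interface differed and may have changed since):

import torch

a = torch.zeros(5)
b = torch.ones(4)
c = torch.rand(3)

# One object holding tensors of different lengths.
nt = torch.nested.nested_tensor([a, b, c])

print(nt.is_nested)                    # True
print([t.shape for t in nt.unbind()])  # [torch.Size([5]), torch.Size([4]), torch.Size([3])]

if torch.cuda.is_available():          # guarded so the sketch also runs without a GPU
    nt = nt.to("cuda")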

Hi Christian,

At first sight this looks exactly like what I imagined. Thanks for pointing me towards this. Unfortunately I need to finish my project first, so I decided to go with the quirky workaround for now. Afterwards I’m happy to try to achieve the same with nestedtensor. How would I get in touch for feedback or suggestions? Should I use this forum, a GitHub issue, or something else entirely?

Hello Philip,

Ideally you’d create issues on the nestedtensor GitHub repo for wider requests, bugs, etc. But I’ll also see your responses to this thread in case something doesn’t fit those buckets.

Thanks for giving it a shot! Just to set expectations: at this point the repository is still under heavy development, so there will be the occasional bug or change in semantics, and documentation is still sparse. I expect to improve on this very quickly, however.

Thanks,
Christian
