Group nodes in TensorBoard

Hey,
I’m wondering whether it is possible in TensorBoard with PyTorch to define groups of nodes that are displayed as one node. In TensorFlow, for example, it is possible to group and name nodes in TensorBoard using tf.name_scope (as far as I understood), as explained here. This is very helpful for keeping an overview when your graph contains several ‘units’. As a simple example, in a feed-forward NN I might want to group all the hidden layers into a node called ‘representation’ and the output layer into a node called ‘classifier’.
Is this somehow possible in PyTorch? It seems like a very important feature to me, yet I couldn’t find anything about it.
Cheers,
Spyderman

Ok, so I figured it out myself. You get a node called ‘representation’ simply by putting all your representation layers into an nn.Sequential stored under the attribute name ‘representation’, so it is very intuitive. In fact, I already had it set up this way; I just didn’t recognize it because my model has so many parts that the graph was a bit messy.
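
To make that concrete, here is a minimal sketch of what I mean. The grouping comes purely from the attribute names (‘representation’ and ‘classifier’ here); the layer sizes, the dummy input shape, and the class name FeedForward are just made-up examples for illustration.

```python
import torch
import torch.nn as nn
from torch.utils.tensorboard import SummaryWriter

class FeedForward(nn.Module):
    def __init__(self):
        super().__init__()
        # All hidden layers end up grouped under one node named 'representation'
        self.representation = nn.Sequential(
            nn.Linear(784, 256),
            nn.ReLU(),
            nn.Linear(256, 128),
            nn.ReLU(),
        )
        # The output layer ends up grouped under a node named 'classifier'
        self.classifier = nn.Sequential(
            nn.Linear(128, 10),
        )

    def forward(self, x):
        return self.classifier(self.representation(x))

# Log the graph; in TensorBoard the 'representation' and 'classifier'
# submodules show up as collapsible nodes.
writer = SummaryWriter()
model = FeedForward()
dummy_input = torch.randn(1, 784)  # shape chosen arbitrarily for the example
writer.add_graph(model, dummy_input)
writer.close()
```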