In the forward method of my nn.Module, I have these two lines:
views = [z] + [transformation(z) for transformation in self.model.transformations]
representations = torch.stack([self.model.encoder(view) for view in views])
where self.model.transformations is a list of nn.Modules (convolutions, actually).
Profiling my code suggests that this list-comprehension formulation might be a speed bottleneck, since each module runs in a separate forward pass.
Is there a PyTorch function/module that would take a list of modules, apply them to my input z efficiently, and then stack the results?
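For context, one workaround I'm considering is to keep the per-transformation loop but concatenate the views along the batch dimension, so the (shared) encoder runs in a single forward pass instead of once per view. A minimal sketch, assuming all views have the same shape and using hypothetical toy modules in place of my real ones:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for my real modules: z is a batch of images,
# transformations is the list of convolutions, encoder is shared.
z = torch.randn(4, 3, 8, 8)
transformations = nn.ModuleList(
    [nn.Conv2d(3, 3, kernel_size=3, padding=1) for _ in range(2)]
)
encoder = nn.Conv2d(3, 5, kernel_size=3, padding=1)

views = [z] + [t(z) for t in transformations]

# Instead of encoding each view separately, concatenate along the batch
# dimension and run the encoder once on the big batch.
batched = torch.cat(views, dim=0)              # (num_views * N, C, H, W)
encoded = encoder(batched)                     # single forward pass
representations = encoded.view(len(views), z.shape[0], *encoded.shape[1:])

# Sanity check: this matches the original per-view loop.
reference = torch.stack([encoder(v) for v in views])
assert torch.allclose(representations, reference, atol=1e-5)
```

This only amortizes the encoder calls; the transformations themselves are still applied one by one, so it helps most when the encoder dominates the cost.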
Thanks