DataParallel model with custom functions

With this solution too, it is not possible to call custom (non-"forward") methods while preserving data parallelism: DataParallel only intercepts calls to forward, so the whole batch ends up being passed into the custom method on a single device. One workaround is to route the custom functions through forward with a simple if-else dispatch: pass the function name as an argument to forward and compare it against a set of hardcoded names to decide which method to call. One other caveat: each tensor passed to forward must have the batch size in the 0th dimension so that DataParallel can scatter it (this is sometimes not the case when using RNNs; this FAQ (https://pytorch.org/docs/stable/notes/faq.html#pack-rnn-unpack-with-data-parallelism) also discusses it).
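
A minimal sketch of this dispatch pattern is below. The module, its layer sizes, and the encode/decode method names are hypothetical, chosen only to illustrate the idea:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(16, 8)
        self.decoder = nn.Linear(8, 16)

    # Custom methods: calling these directly on a DataParallel-wrapped
    # model would bypass the scatter/gather machinery entirely.
    def encode(self, x):
        return self.encoder(x)

    def decode(self, z):
        return self.decoder(z)

    def forward(self, x, func_name="encode"):
        # Dispatch on a hardcoded set of names so that every call goes
        # through forward, which DataParallel knows how to parallelize.
        if func_name == "encode":
            return self.encode(x)
        elif func_name == "decode":
            return self.decode(x)
        else:
            raise ValueError(f"unknown function name: {func_name}")

# Assumes at least one GPU is available.
model = nn.DataParallel(MyModel()).cuda()

x = torch.randn(32, 16).cuda()        # batch size in the 0th dimension
z = model(x, func_name="encode")      # batch is scattered across GPUs
x_hat = model(z, func_name="decode")
```

Note that non-tensor arguments such as func_name are replicated to every device rather than scattered, which is why a plain string works as the dispatch key.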