Is the higher library for meta-learning compatible with PyTorch's distributed libraries?

I want to speed up my MAML code with distributed training. I am using higher and was wondering whether the library even works with things like DDP, distributed RPC, distributed autograd, etc.
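For context, here is a minimal sketch of the kind of setup I have in mind (names like `tasks` and the toy model are just placeholders, and the part I'm unsure about is whether higher's `innerloop_ctx` can track the parameters of a DDP-wrapped module at all):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import higher
from torch.nn.parallel import DistributedDataParallel as DDP

# Assumes the process group has already been initialized (e.g. via torchrun)
# and that `tasks` yields (support_x, support_y, query_x, query_y) batches.

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
ddp_model = DDP(model)  # <- unclear if higher works on a DDP-wrapped module
meta_opt = torch.optim.Adam(ddp_model.parameters(), lr=1e-3)
inner_opt = torch.optim.SGD(ddp_model.parameters(), lr=1e-2)

for support_x, support_y, query_x, query_y in tasks:
    meta_opt.zero_grad()
    # MAML inner loop with higher's functional model / differentiable optimizer
    with higher.innerloop_ctx(
        ddp_model, inner_opt, copy_initial_weights=False
    ) as (fmodel, diffopt):
        for _ in range(5):  # inner adaptation steps on the support set
            inner_loss = F.mse_loss(fmodel(support_x), support_y)
            diffopt.step(inner_loss)
        # Outer (meta) loss on the query set, backprop through the inner loop.
        # Does DDP's gradient all-reduce interact correctly with this backward?
        outer_loss = F.mse_loss(fmodel(query_x), query_y)
        outer_loss.backward()
    meta_opt.step()
```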

Note: this is being cross-posted for increased visibility (no response in the GitHub issues after a week, so perhaps someone in the forum knows :slight_smile: ):

Any update on this? Thanks!