Using the model in real life

How would you like to deploy your model?
If you would like to feed the inputs “live”, you wouldn’t necessarily need a DataLoader; you can pass the incoming samples to the model directly, as sketched below.
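
A minimal sketch of what that could look like; the model (a plain `nn.Linear`) and the input shape are just placeholders for illustration:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)   # stand-in for your trained model
model.eval()               # disable dropout / batchnorm updates for inference

sample = torch.randn(10)   # a single "live" sample, e.g. from a sensor or request
with torch.no_grad():
    output = model(sample.unsqueeze(0))  # add the batch dimension manually
print(output)
```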
@jspisak mentioned Flask as a valid approach here, if you don’t need low latency, and also highlighted a proposal from Amazon for a high-performance PyTorch C++ serving platform.
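
For the Flask route, a rough sketch could look like this. The model path (`model.pt`), the endpoint name, and the JSON format (`"inputs"` as a nested list) are assumptions for illustration, not a fixed API:

```python
import torch
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical path to a TorchScript model exported via torch.jit.script/trace
model = torch.jit.load("model.pt")
model.eval()

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"inputs": [[...feature values...]]}
    data = request.get_json()
    x = torch.tensor(data["inputs"], dtype=torch.float32)
    with torch.no_grad():
        out = model(x)
    return jsonify(prediction=out.tolist())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Each request runs the model synchronously in the Flask worker, which is fine for low-traffic use cases but, as noted above, not ideal if you need low latency or high throughput.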