Hi all,
We’ve created something really cool that I think this community especially will like!
I got the creators and core contributors behind ALL of the major model training libraries in the PyTorch ecosystem to talk about their projects, specifically:
- Philosophy
- API structure
- The learning curve for new users
- Built-in features (what you get out of the box)
- Extension capabilities (how easily they can be extended for research)
- Reproducibility
- Distributed training
- Productionization
- Popularity
The result is this blog post.
Hopefully, with first-hand information from the people behind fastai, lightning, ignite, skorch, catalyst, and torchbearer, it will be easier to choose the library that best improves your deep learning projects.
If you have any additional questions, or if there's something interesting we didn't cover, drop a comment!