pytorch-mighty, a package for monitoring training progress

Dear PyTorch users,

I’d like to present a package of mine that automates monitoring the training process:

I initially developed it as a tool to support other projects of mine, but thought it could be valuable to some of you. The idea is similar to Horovod and many other packages that automate the training process, so I'll focus on the features that make this package particularly interesting:

  1. `trainer.restore(checkpoint_path=None)`: seamless restoration of the training progress AND the monitor plots. Imagine you saved your progress yesterday with 10+ infographic plots; today you wake up and continue adding metric points (loss, accuracy, etc.) to the same figures as if nothing had happened.
  2. Several Mutual Information estimators: this is where I put most of my effort. If you're interested in information theory and in how information flows between layers across epochs, you'll find a separate repository (linked there) with benchmarks of Mutual Information estimators.
  3. More than just accuracy and loss curves. One reason I developed such an advanced monitoring system is that I wasn't satisfied with the basic plots of accuracy and loss vs. epoch, so I monitor several other metrics as well.
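To make point 1 concrete, here is a minimal, stdlib-only sketch of the idea behind `trainer.restore(checkpoint_path=None)`: the trainer keeps its metric history alongside the epoch counter, so after restoring a checkpoint, new metric points continue on the same curves. The `Trainer` class below is a hypothetical toy, not the pytorch-mighty API (which also restores model and optimizer state); only the `restore(checkpoint_path=None)` signature comes from the post.

```python
import json


class Trainer:
    """Toy trainer that checkpoints its metric history (hypothetical sketch,
    NOT the actual pytorch-mighty Trainer)."""

    def __init__(self):
        self.epoch = 0
        self.history = {"loss": [], "accuracy": []}

    def train_epoch(self, loss, accuracy):
        # In a real trainer these values would come from the training loop.
        self.epoch += 1
        self.history["loss"].append(loss)
        self.history["accuracy"].append(accuracy)

    def save(self, path):
        # Persist the epoch counter and the full metric history.
        with open(path, "w") as f:
            json.dump({"epoch": self.epoch, "history": self.history}, f)

    def restore(self, checkpoint_path=None):
        # Mirrors the optional-path signature: with no path, start fresh.
        if checkpoint_path is None:
            return
        with open(checkpoint_path) as f:
            state = json.load(f)
        self.epoch = state["epoch"]
        self.history = state["history"]


if __name__ == "__main__":
    trainer = Trainer()
    trainer.train_epoch(loss=0.9, accuracy=0.5)
    trainer.train_epoch(loss=0.7, accuracy=0.6)
    trainer.save("checkpoint.json")

    # A fresh process restores the checkpoint and keeps plotting on the
    # same curves: the old points are still in the history.
    resumed = Trainer()
    resumed.restore(checkpoint_path="checkpoint.json")
    resumed.train_epoch(loss=0.5, accuracy=0.7)
    print(resumed.history["loss"])  # old points followed by the new one
```

Because the restored history contains every previously logged point, replotting it reproduces yesterday's figures and simply extends them.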
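For point 2, a simple baseline among Mutual Information estimators is the plug-in (histogram) estimator: bin or discretize the two variables, count joint occurrences, and sum p(x, y) · log₂(p(x, y) / (p(x) p(y))). The sketch below is that generic baseline, not one of the estimators shipped with pytorch-mighty (the linked benchmark repository covers those).

```python
import math
from collections import Counter


def mutual_information(xs, ys):
    """Plug-in Mutual Information estimate, in bits, for two sequences of
    discrete (or pre-binned) values of equal length."""
    n = len(xs)
    px = Counter(xs)            # marginal counts of x
    py = Counter(ys)            # marginal counts of y
    pxy = Counter(zip(xs, ys))  # joint counts of (x, y)
    mi = 0.0
    for (x, y), count in pxy.items():
        p_joint = count / n
        # p_joint / (p_x * p_y) == count * n / (px[x] * py[y])
        mi += p_joint * math.log2(count * n / (px[x] * py[y]))
    return mi


print(mutual_information([0, 1] * 50, [0, 1] * 50))  # perfectly correlated: 1 bit
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # independent: 0 bits
```

This estimator is biased for small samples and continuous data, which is exactly why more sophisticated estimators (and benchmarks comparing them) are worth having.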

I work with vision only, so I don't know how well this package applies to other domains. Even if you don't use pytorch-mighty as is, you may find a number of interesting monitoring plots that you can copy into your own projects.
