How to demonstrate (or teach) learning theory using PyTorch

Hi,

I was wondering whether there are any repos or notebooks in the ecosystem that use PyTorch to demonstrate theory rather than run experiments (although the border between the two is a bit vague).

There’s a lot of beautiful theory on function approximation. For example,

Learning Real and Boolean Functions: When Is Deep Better Than Shallow

is quite readable.

I think it builds on Vapnik–Chervonenkis theory.

@smth - Vladimir Vapnik is at FAIR now, I think? So I guess there might be some in-house teaching courses that cover statistical learning theory and are implemented in PyTorch?

@apaszke, @fmassa - is this something you would be interested in? Or is the focus of PyTorch leaning more towards experiments/applications?


If anyone knows of any good theory repos/projects in any deep learning framework (it doesn’t have to be PyTorch), could you please link them here?

I’d be interested in implementing them in PyTorch - I’ll be doing some DL theory teaching/demos over the summer.

This would also be very helpful to others who are giving DL demos/teaching.

Thanks a lot for your help,

Aj

Definitely not learning theory, but here is a notebook that teaches NLP properly, using PyTorch as a tool rather than showcasing PyTorch as a tech demo:
https://github.com/rguthrie3/DeepLearningForNLPInPytorch

In general, to give you my completely honest and regularized opinion, something like “Learning Real and Boolean Functions” is far more valuable to teach in NumPy than in PyTorch. I won’t pretend otherwise: NumPy is much more accessible.

Unless you really need to teach/showcase a gradient-based learning method (SVM via SGD?) or something GPU-based, you can probably just stick to NumPy.
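That SVM-SGD case is about the smallest PyTorch demo there is, for what it’s worth. A minimal sketch (the toy data and hyperparameters are made up for illustration):

```python
import torch

# Toy binary classification data: two Gaussian blobs, labels in {-1, +1}
torch.manual_seed(0)
X = torch.cat([torch.randn(50, 2) + 2, torch.randn(50, 2) - 2])
y = torch.cat([torch.ones(50), -torch.ones(50)])

# Linear SVM: minimize mean hinge loss + L2 penalty with plain SGD
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([w, b], lr=0.1)

for epoch in range(100):
    opt.zero_grad()
    margins = y * (X @ w + b)  # y_i * (w . x_i + b)
    loss = torch.clamp(1 - margins, min=0).mean() + 0.01 * w.dot(w)
    loss.backward()
    opt.step()

print(f"final hinge loss: {loss.item():.4f}")
```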


On my list of things I’d like to see or do is an implementation of
Alemi et al.: Deep Variational Information Bottleneck
or
Shwartz-Ziv and Tishby: Opening the Black Box of Deep Neural Networks via Information
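The core of the first one is compact enough to sketch: the VIB loss is the decoder’s cross-entropy through one reparameterized sample plus a β-weighted KL between the encoder distribution and a standard normal prior. A rough sketch (the decoder callable, shapes, and β here are placeholder assumptions on my part, not Alemi et al.’s exact setup):

```python
import torch
import torch.nn.functional as F

def vib_loss(mu, logvar, decoder, targets, beta=1e-3):
    """Deep Variational Information Bottleneck objective:
    E[CE(decoder(z), y)] + beta * KL(q(z|x) || N(0, I)),
    with a single reparameterized sample z = mu + sigma * eps."""
    eps = torch.randn_like(mu)
    z = mu + (0.5 * logvar).exp() * eps            # reparameterization trick
    ce = F.cross_entropy(decoder(z), targets)      # variational bound on the prediction term
    # Closed-form KL of a diagonal Gaussian against N(0, I)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
    return ce + beta * kl                          # the KL term bounds I(Z; X)
```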

Also, I think the Bayesian neural network chapters in the textbook
MacKay: Information Theory, Inference, and Learning Algorithms
might be an inspiration for some things you could do. Though for this, Soumith’s comment regarding NumPy vs. torch may apply as well.
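If you did do it in torch, the natural modern stand-in for MacKay’s evidence framework would be a mean-field variational layer in the Bayes-by-backprop style - to be clear, that is a substitute technique, not MacKay’s own method. A rough sketch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    """Linear layer with a mean-field Gaussian posterior over the weights
    (Bayes-by-backprop style; bias omitted for brevity)."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(n_out, n_in))
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -3.0))  # sigma = softplus(rho)

    def forward(self, x):
        sigma = F.softplus(self.w_rho)
        w = self.w_mu + sigma * torch.randn_like(sigma)  # sample fresh weights each pass
        return x @ w.t()

    def kl(self):
        # Closed-form KL(q(w) || N(0, I)) for the diagonal Gaussian posterior;
        # add this (scaled by 1/dataset size) to the data loss during training.
        sigma = F.softplus(self.w_rho)
        return (sigma.pow(2) + self.w_mu.pow(2) - 1 - 2 * sigma.log()).sum() / 2
```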

Naturally, I’d be very interested in what you do. :wink:

Best regards

Thomas


I know this is an old conversation, but I am developing a full-fledged package for the information theory of deep learning in PyTorch, which has a lot of information-bottleneck functionality, including HSIC-bottleneck sigma networks (yes, they train without backprop!). The library is currently in its testing phase, but there are notebooks available, which I thought could be useful.
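For context, the HSIC bottleneck trains each layer by balancing Hilbert-Schmidt independence criterion terms between its activations and the input/labels, instead of backpropagating a loss. A minimal sketch of the (biased) empirical HSIC estimator the method builds on - just the bare estimator with Gaussian kernels, not the package’s actual API:

```python
import torch

def gaussian_kernel(x, sigma=1.0):
    # Pairwise squared distances -> RBF Gram matrix
    d2 = torch.cdist(x, x).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC (Gretton et al.): tr(K H L H) / (n - 1)^2,
    where H centers the Gram matrices; near zero when x and y are independent."""
    n = x.size(0)
    K = gaussian_kernel(x, sigma)
    L = gaussian_kernel(y, sigma)
    H = torch.eye(n) - torch.ones(n, n) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2
```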
Link to the repo:


Link to the documentation:
https://pyglow.github.io/
