I was wondering if there are any repos or notebooks in the ecosystem which use PyTorch to demonstrate theory rather than run experiments (although the border between the two is a bit vague).
There's a lot of beautiful theory on function approximation, for example.
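To be concrete about the kind of demo I mean, here is a minimal sketch (my own, not from any existing repo) that makes the universal approximation theorem tangible by fitting a small net to sin(x):

```python
import math
import torch
import torch.nn as nn

# Fit a one-hidden-layer network to sin(x) on [-pi, pi]:
# a concrete, visualizable instance of universal approximation.
torch.manual_seed(0)
x = torch.linspace(-math.pi, math.pi, 256).unsqueeze(1)
y = torch.sin(x)

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for _ in range(2000):
    loss = nn.functional.mse_loss(net(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final MSE:", loss.item())  # should be close to zero
```

Widening the hidden layer, or plotting net(x) against sin(x), turns the approximation story into something students can see rather than just hear about.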
@smth - Vladimir Vapnik is at FAIR now, I think? So I guess there might be some in-house teaching courses that cover statistical learning theory and are implemented in PyTorch?
@apaszke, @fmassa is this something you guys would be interested in? Or is the focus of PyTorch leaning more towards experiments/applications?
If anyone knows of any good theory repos/projects in any deep learning framework (it doesn't have to be PyTorch), could you please link them here?
I’d be interested in implementing them in PyTorch - I’ll be doing some DL theory teaching/demos over the summer.
This would also be very helpful to others who are giving DL demos/teaching.
In general, to give you my completely honest and regularized opinion, something like "Learning Real and Boolean Functions" is way more valuable to teach in numpy than in PyTorch. I won't pretend otherwise: numpy is much more accessible.
Unless you really need to teach/showcase a gradient-based learning method (SVM via SGD?) or teach something GPU-based, you can probably just stick to numpy.
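To be concrete about the SVM-SGD case, a rough sketch (nothing official, just hinge loss plus plain SGD on a linear model, with toy data I made up):

```python
import torch

# Toy linearly separable data with labels in {-1, +1}
torch.manual_seed(0)
X = torch.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).float() * 2 - 1

w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([w, b], lr=0.1)

for _ in range(100):
    margins = y * (X @ w + b)
    # Primal SVM objective: mean hinge loss + L2 penalty on w
    loss = torch.clamp(1 - margins, min=0).mean() + 0.01 * (w ** 2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

print("train accuracy:", ((X @ w + b).sign() == y).float().mean().item())
```

That's about the only place in a demo like this where autograd buys you something over hand-derived numpy updates.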
Also, I think the Bayesian neural network chapters in MacKay's textbook Information Theory, Inference, and Learning Algorithms might provide inspiration for some things you could do. Though for this, Soumith's comment regarding numpy vs. torch may apply as well.
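For instance, the Gaussian-posterior machinery those chapters build on (Bayesian linear regression with MacKay-style prior and noise precisions alpha and beta) fits in a few lines of numpy. A purely illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
x = rng.uniform(-1, 1, size=30)
t = np.sin(3 * x) + 0.1 * rng.standard_normal(30)

# Polynomial design matrix
Phi = np.vander(x, N=6, increasing=True)

alpha, beta = 1.0, 100.0  # prior precision, noise precision

# Posterior over weights is N(m, S) with
#   S^{-1} = alpha*I + beta * Phi^T Phi,   m = beta * S Phi^T t
S = np.linalg.inv(alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi)
m = beta * S @ Phi.T @ t

# Predictive mean and variance at new inputs: 1/beta + phi^T S phi
x_new = np.linspace(-1, 1, 5)
Phi_new = np.vander(x_new, N=6, increasing=True)
mean = Phi_new @ m
var = 1 / beta + np.einsum("ij,jk,ik->i", Phi_new, S, Phi_new)
print(mean, np.sqrt(var))
```

Plotting the predictive error bars alongside the data is a nice way to show what "Bayesian" buys you before moving on to the neural network case.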
I know this is an old conversation, but I am developing a full-fledged package for the information theory of deep learning in PyTorch, which has a lot of information bottleneck functionality, including HSIC bottleneck sigma networks (yes, they train without backprop!). The library is currently in its testing phase, but there are notebooks available, so I thought it might be useful. A minimal sketch of the core HSIC computation follows the link below.
Link to the repo:
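For anyone curious what trains those layers instead of backprop: each layer gets a local objective built from an empirical HSIC estimator, roughly minimizing HSIC(Z, X) - beta * HSIC(Z, Y) for its hidden representation Z. A generic sketch of the estimator itself (this is not the package's actual API, just the standard biased estimator with Gaussian kernels):

```python
import torch

def gaussian_kernel(x, sigma=1.0):
    # Pairwise squared distances -> RBF Gram matrix
    d = torch.cdist(x, x) ** 2
    return torch.exp(-d / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC estimator: trace(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the Gram matrices
    n = x.shape[0]
    K = gaussian_kernel(x, sigma)
    L = gaussian_kernel(y, sigma)
    H = torch.eye(n) - torch.ones(n, n) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2

# Example: HSIC between a random representation and one-hot labels
z = torch.randn(64, 10)
y = torch.nn.functional.one_hot(torch.randint(0, 3, (64,))).float()
print(hsic(z, y).item())
```

Because each layer's objective depends only on its own activations and the inputs/labels, no gradients need to flow between layers, which is where the "train without backprop" claim comes from.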