I've seen the phrase "for research vs. for industry" thrown around a lot (much like nltk vs spacy). What is the difference between the two paradigms? If I work in industry, why wouldn't I want to use PyTorch, and vice versa? I think the author touched on this in the comments, saying the lines often get blurred:
Yangqing here. Caffe2 and PyTorch teams collaborate very closely to deliver the fastest deep learning applications as well as flexible research, as well as creating common building blocks for the deep learning community. We see Caffe2 as primarily a production option and Torch as a research option, but of course the line gets blurred sometimes and we bridge them very often.
We also adopt the idea of “unframework” - in the sense that we focus on building key blocks for AI. Gloo, NNPACK, and FAISS are great examples of these and they can be used by ANY deep learning frameworks. On top of these, we use lightweight frameworks such as Caffe2 and PyTorch for extremely agile development in both research and products.
But I'm still not clear on why and when I should use which one, especially since Python bindings are available for Caffe2 as well.
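For concreteness, the paradigm split usually comes down to "define-and-run" (Caffe2-style static graphs) versus "define-by-run" (PyTorch-style eager execution). The following is a toy sketch of that distinction in plain Python — it is not the actual API of either framework, just an illustration of the two execution models:

```python
# Toy illustration of the two paradigms (NOT real Caffe2/PyTorch code).

# Define-and-run (Caffe2-style): describe the computation as a graph
# first, execute it later. Because the graph is a plain data structure,
# it can be serialized, optimized, and shipped to a server or mobile
# device without the Python code that built it.
graph = [("mul", 2), ("add", 3)]  # x -> x*2 -> x*2 + 3

def run_graph(graph, value):
    """Interpret the recorded ops in order."""
    for op, operand in graph:
        if op == "mul":
            value *= operand
        elif op == "add":
            value += operand
    return value

print(run_graph(graph, 5))  # 13

# Define-by-run (PyTorch-style): the model *is* ordinary Python code,
# so you can use native branches, loops, and a debugger directly --
# convenient for research, but harder to export as a standalone
# artifact for production serving.
def model(x):
    y = x * 2
    if y > 8:  # data-dependent control flow, trivial in eager mode
        y = y + 3
    return y

print(model(5))  # 13
```

Roughly, production setups tend to favor the first style (a fixed, serializable graph that a C++ runtime can execute anywhere), while research tends to favor the second (maximum flexibility while iterating), which matches the research/production framing in the quoted comment.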