Caffe2 vs PyTorch


(Shaun) #1

Hi,

I understand that both Caffe2 and PyTorch have support from Facebook. I have a few questions about them:

  1. What are the main differences between both the libraries?
  2. Is one better than the other in certain aspects, i.e., would we choose one over the other based on the problem domain?
  3. Will PyTorch continue to be actively developed, or is there a direction where it would be “merged” into Caffe2?

Thanks.


(Yun Chen) #2

Answers to most of your questions can be found on Reddit.


(Shaun) #3

I’ve seen the phrase “for research vs. for industry” (like nltk vs spacy) thrown around a lot. What is the difference between the two paradigms? If I work in industry, why wouldn’t I want to use PyTorch, and vice versa? I think this was mentioned by the author in the comments, where he says the lines often get blurred:

Yangqing here. Caffe2 and PyTorch teams collaborate very closely to deliver the fastest deep learning applications as well as flexible research, as well as creating common building blocks for the deep learning community. We see Caffe2 as primarily a production option and Torch as a research option, but of course the line gets blurred sometimes and we bridge them very often.
We also adopt the idea of “unframework” - in the sense that we focus on building key blocks for AI. Gloo, NNPACK, and FAISS are great examples of these and they can be used by ANY deep learning frameworks. On top of these, we use lightweight frameworks such as Caffe2 and PyTorch for extremely agile development in both research and products.

but I’m still not clear on when and why I should use which one, especially since Python bindings are available for Caffe2 as well.


#4

Yeah, I also read an article on Caffe2 by NVIDIA with Facebook. I was wondering which one would be better, Caffe2 or PyTorch, and, if anybody is a beginner like me, which one should be preferred.


(Shaun) #5

I am by no means an expert, but I think PyTorch is a bit ahead of Caffe2 and would be a good starting point. My question is that I (and, I would guess from the comments, many others) can’t find a clear line of distinction between the two libraries other than “Caffe2 is for industry and PyTorch is for research”, and I don’t really know what that means. I hope the developers of either (or both?) can pitch in.


(Yun Chen) #6

Here is my personal opinion; I’m not an expert either.

  • Caffe2 is superior for deployment because of its “code once, run anywhere” design. It can be deployed on mobile, which is really appealing, and it is said to be much faster than other implementations.
  • PyTorch is super elegant and flexible: it can be used at a low level like TensorFlow, or at a high level like Keras (which borrowed a lot from Torch), and it can do things they can’t because it is dynamic.
  • In research, we need to experiment a lot, debug a lot, adjust parameters, try the latest weird model architectures, and build our own special networks. PyTorch is well qualified and flexible for these tasks.
  • When deploying, we care more about a robust, generalizable, scalable system. Amazon, Intel, Qualcomm, and Nvidia all claim to support Caffe2.
  • The line gets blurred sometimes: Caffe2 can be used for research, and PyTorch can also be used for deployment.
  • Caffe2 is planning to share a lot of backends with Torch and PyTorch; Caffe2 integration is one work item in PyTorch (medium priority), so in the future we will be able to export a PyTorch nn.Module to a Caffe2 model.
  • If you are a beginner who wants to learn deep learning and a framework, use PyTorch. You’ll enjoy it.
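To make the “dynamic” point above concrete, here is a minimal sketch (assuming PyTorch is installed; `RepeatNet` and its depth argument are made up for illustration). Because PyTorch rebuilds the graph on every forward pass, ordinary Python control flow that depends on runtime values just works, whereas a static-graph framework would need dedicated control-flow ops:

```python
import torch
import torch.nn as nn

class RepeatNet(nn.Module):
    """Applies the same linear layer a runtime-chosen number of times."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)

    def forward(self, x, n_steps):
        # The network depth is decided per call, at run time.
        for _ in range(n_steps):
            x = torch.relu(self.fc(x))
        return x

net = RepeatNet()
x = torch.randn(2, 4)
# Same module, different effective depth on each call:
print(net(x, 1).shape)  # torch.Size([2, 4])
print(net(x, 5).shape)  # torch.Size([2, 4])
```

Expressing this in a static graph (Caffe2, or TensorFlow at the time) would require unrolling the loop ahead of time or using special loop operators, which is part of why PyTorch feels more natural for experimental research code.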

(李作刚) #7

Like this?


(Yun Chen) #8

Yes, kind of :grinning:


(Ajay Talati) #9

Hi Shaun @shaun, if you’re interested in embedded devices, this is a nice read :smile:

Facebook and Qualcomm Announce Collaboration to Support Optimization of Caffe2 and Snapdragon NPE


(Herleeyandi Markoni) #10

Is there any Docker image which contains both PyTorch and Caffe2? I am a little bit lazy to install Caffe2 on my machine :smiley:.


#11

The ONNX docker image has both: https://github.com/onnx/onnx#docker


(ngimel) #12

1 GB libTHC! What architectures are you compiling for?

root@b4764712536d:~/programs/pytorch# find / -name libTHC"*" | xargs ls -lrt
-rw-r--r-- 1 root root  273196496 Sep 12 02:37 /opt/conda/lib/python2.7/site-packages/torch/lib/libTHCUNN.so.1
-rw-r--r-- 1 root root    6178128 Sep 12 02:37 /opt/conda/lib/python2.7/site-packages/torch/lib/libTHCS.so.1
-rw-r--r-- 1 root root 1006545712 Sep 12 02:37 /opt/conda/lib/python2.7/site-packages/torch/lib/libTHC.so.1

#13

I think @houseroad didn’t add the relevant binary flags and the Xcompress stuff. I’ll let him know.


(Houseroad) #14

With some compression flags, libTHC got reduced to around 260 MB. The Docker images have been updated.