I’m new to Glow and am trying to understand exactly what it does and how it can be used.
From reading the paper I get the impression that normal PyTorch code can be compiled and optimized, but from what I can see, the closest thing to this is the torch_glow library. Is this the case, or have I missed something?
Hi @kempj, Glow has multiple front ends. That is, we can load models from ONNX or Caffe2 protobufs, or through TorchScript from PyTorch (torch_glow). Or you could even write your model directly in C++ with our C++ APIs if you wanted.
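To make the C++ path concrete, here is a minimal sketch of building and running a tiny graph with the C++ API. Exact signatures (e.g. `EE.compile(...)`) have shifted across Glow versions, so treat this as an outline and check the headers in your checkout rather than as an exact recipe:

```cpp
#include "glow/ExecutionEngine/ExecutionEngine.h"
#include "glow/Graph/Graph.h"

using namespace glow;

int main() {
  // The ExecutionEngine owns the module and the compiled code
  // (it typically defaults to the interpreter backend).
  ExecutionEngine EE;
  Module &mod = EE.getModule();
  Function *F = mod.createFunction("main");

  // Two 1-D float inputs.
  auto *A = mod.createPlaceholder(ElemKind::FloatTy, {4}, "A", false);
  auto *B = mod.createPlaceholder(ElemKind::FloatTy, {4}, "B", false);

  // Build a tiny graph: out = A + B.
  auto *add = F->createAdd("add", A, B);
  auto *save = F->createSave("save", add);

  // Allocate backing tensors and fill the inputs.
  PlaceholderBindings bindings;
  bindings.allocate(mod.getPlaceholders());
  bindings.get(A)->getHandle<float>() = {1, 2, 3, 4};
  bindings.get(B)->getHandle<float>() = {10, 20, 30, 40};

  // Compile for inference and run.
  EE.compile(CompilationMode::Infer);
  EE.run(bindings);

  // Read the result back out.
  bindings.get(save->getPlaceholder())->dump();
  return 0;
}
```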
torch_glow is under active development and is less mature than our ONNX and Caffe2 protobuf loaders. If you have problems with torch_glow, another path from PyTorch to Glow is to export your model to ONNX and then load it into Glow.
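For the ONNX route, the Glow side looks roughly like the sketch below. The file name `model.onnx`, the input name `"input"`, and the input shape are placeholders for illustration, and the `ONNXModelLoader` constructor has changed across versions, so this is only an outline:

```cpp
#include "glow/ExecutionEngine/ExecutionEngine.h"
#include "glow/Graph/Graph.h"
#include "glow/Importer/ONNXModelLoader.h"

using namespace glow;

int main() {
  ExecutionEngine EE;
  Module &mod = EE.getModule();
  Function *F = mod.createFunction("main");

  // Shape/type of the graph input; must match what was exported from PyTorch.
  Type inputType(ElemKind::FloatTy, {1, 3, 224, 224});

  // Parse the ONNX protobuf and build the corresponding Glow graph into F.
  ONNXModelLoader loader("model.onnx", {"input"}, {&inputType}, *F);

  // Allocate tensors for all placeholders, then compile and run.
  PlaceholderBindings bindings;
  bindings.allocate(mod.getPlaceholders());
  EE.compile(CompilationMode::Infer);
  EE.run(bindings);
  return 0;
}
```

On the PyTorch side you would typically produce the file with `torch.onnx.export`, and if you just want to try a model without writing any C++, the `image-classifier` tool in the Glow repo can load ONNX and Caffe2 models directly.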
It’s definitely still under active development. It has gotten more mature, but I’m not sure whether it’s still easily buildable. You can see the latest build instructions, from a couple of months ago, here.