Using NCHW format as default for Glow

I’d like to ask whether there is any way (or plan) to add to Glow the ability to use NCHW by default instead of NHWC. The documentation says in some places that Glow’s default is NHWC, but at the same time, methods that generate the NCHW layout are used in many places.

I strongly doubt that all backends will use NHWC and rely on conversion instead – for example, convertConvToNCHWConv. Such an approach will significantly affect performance by introducing all of those Transpose operations. Instead, it should be possible from the very beginning to choose which format will be used; TensorFlow, for example, allows choosing the data format.
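To make the conversion approach concrete, here is a small NumPy sketch (my own illustration, not Glow code): an NCHW-only kernel is used on NHWC data by transposing the input before and the output after. The `conv_nchw` function is a hypothetical stand-in (a trivial 1×1 convolution) for a real NCHW-only kernel; the point is only the transpose wrapping around it.

```python
import numpy as np

def conv_nchw(x, w):
    # Stand-in for an NCHW-only kernel: a trivial 1x1 "convolution".
    # x: (N, C, H, W), w: (C_out, C_in, 1, 1) -> out: (N, C_out, H, W)
    return np.einsum("nchw,oc->nohw", x, w[:, :, 0, 0])

def conv_nhwc_via_transpose(x_nhwc, w):
    # The conversion approach: wrap the NCHW-only kernel in transposes.
    x_nchw = x_nhwc.transpose(0, 3, 1, 2)   # NHWC -> NCHW
    y_nchw = conv_nchw(x_nchw, w)
    return y_nchw.transpose(0, 2, 3, 1)     # NCHW -> NHWC
```

Each such wrapped operator adds two transposes; without an optimizer that removes them, those data movements show up directly as runtime cost.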

You’re right that many backends will use NCHW, but the conversions usually have no runtime cost – the graph optimizer completely eliminates transposes that cancel. I just brought up an experimental NCHW backend and no transposes were needed to run ResNet or ResNeXt :-).
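The cancellation the optimizer performs can be sketched in a few lines of Python (a minimal illustration, not Glow’s actual graph optimizer): when two transposes are applied back to back, their permutations compose, and if the composition is the identity both nodes disappear.

```python
def compose(p1, p2):
    # Applying transpose p1 and then p2 equals one transpose with
    # permutation q[i] = p1[p2[i]] (NumPy-style axis permutations).
    return tuple(p1[i] for i in p2)

def eliminate_transposes(ops):
    # ops: a linear chain of ("transpose", perm) or ("op", name) entries.
    out = []
    for op in ops:
        if op[0] == "transpose" and out and out[-1][0] == "transpose":
            merged = compose(out[-1][1], op[1])
            out.pop()
            # Keep the merged transpose only if it actually permutes axes.
            if merged != tuple(range(len(merged))):
                out.append(("transpose", merged))
        else:
            out.append(op)
    return out
```

For example, NHWC→NCHW is the permutation (0, 3, 1, 2) and NCHW→NHWC is (0, 2, 3, 1); they compose to the identity, so a pair of them inserted around an operator cancels and the intermediate data movement never happens at runtime.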

@Bert_Maher thanks for the info. That sounds really interesting. If I understand correctly, the low-level IR, which is the input to my backend, should contain no transpose operations (or a significantly reduced number of them), while at the same time using properly formatted convolutions.

So, for example, for a convolution that works only with NCHW, I should just use the conversion approach from the OpenCL backend, and the high- and low-level optimizations will take care of improving performance? That’s really impressive :)

Yep, that’s exactly right! The OpenCL backend is a good place to look for NCHW handling. And please do let us know if you see transposes that aren’t eliminated with that approach; it’s always possible to improve the optimizer.

Hi @Bert_Maher. I’m trying to create an experimental NCHW backend like you did. For this, I reused the TensorLayout modules (.cpp, .h) from the OpenCL backend, but even in the low-level IR I still find transpose operations, and the Convolution operations in lenet-mnist still use the NHWC layout. What am I doing wrong, or what am I missing? Thanks in advance.