Quantization-aware training: Conv1D and LSTM support

I am just starting to look at quantization-aware training and am currently working my way through the tutorials.

My current models make use of layers such as Conv1D and, in some cases, LSTM and GRU.

According to the documentation, quantization-aware training is not currently supported for those particular layer types. As a workaround, one possibility would be to replace Conv1D with Conv2D and reshape the data accordingly (it actually remains 1D, but gains an extra dimension of size 1).
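To illustrate what I mean, here is a minimal sketch (the channel counts and lengths are just placeholders); the dummy dimension of size 1 keeps the data effectively 1D while letting Conv2d do the work:

```python
# Minimal sketch of the Conv1d -> Conv2d workaround; shapes are placeholders.
import torch
import torch.nn as nn

x_1d = torch.randn(8, 16, 128)             # (batch, channels, length)

conv1d = nn.Conv1d(16, 32, kernel_size=3, padding=1)
y_1d = conv1d(x_1d)                        # (8, 32, 128)

# Equivalent Conv2d: add a dummy height dimension of size 1 and use a
# (1, k) kernel so the convolution still slides only along the length axis.
x_2d = x_1d.unsqueeze(2)                   # (batch, channels, 1, length)
conv2d = nn.Conv2d(16, 32, kernel_size=(1, 3), padding=(0, 1))
y_2d = conv2d(x_2d).squeeze(2)             # back to (8, 32, 128)

print(y_1d.shape, y_2d.shape)              # both torch.Size([8, 32, 128])
```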

As for LSTM and GRU, I am not yet sure what the workaround would be.


It seems this may partially answer my question.

@Paul_Creaser,
Conv1d is now available in the nightly builds. LSTM is supported with dynamic quantization, while GRU is currently not available. However, an RNN base class is available with dynamic quantization.
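For reference, here is a minimal sketch of dynamic quantization applied to an LSTM; the model and layer sizes below are just placeholders:

```python
# Minimal sketch of dynamic quantization on an LSTM; sizes are placeholders.
import torch
import torch.nn as nn

class SmallLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, 10)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])      # use the last time step

model = SmallLSTM().eval()

# Dynamic quantization: weights are converted to int8 ahead of time,
# activations are quantized on the fly at inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.LSTM, nn.Linear}, dtype=torch.qint8
)

x = torch.randn(4, 20, 32)                 # (batch, seq_len, features)
print(quantized(x).shape)                  # torch.Size([4, 10])
```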
