Facebook DLRM model inference using the PyTorch 2.0 compilation method

Hi,

Has anyone tried the Facebook DLRM 10GB or 100GB models with PyTorch 2.0?

I was trying it, but it got stuck and never produced any final output. For the 10GB model I waited almost 4 hours with no output.

However, if I try the same code with smaller models, i.e. fewer EmbeddingBag tables with smaller sizes, I do get the output.

I am trying the CPU flow with inductor as the backend.
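
For reference, here is a stripped-down sketch of the kind of setup I am running (the model, table counts, and sizes below are toy placeholders, not the actual DLRM benchmark code; in the real run the embedding tables are the full 10GB/100GB ones):

```python
import torch
import torch.nn as nn


class TinyDLRM(nn.Module):
    """Toy DLRM-style model: a few EmbeddingBag tables plus a small MLP."""

    def __init__(self, num_embeddings=1000, embedding_dim=16, num_tables=4):
        super().__init__()
        self.tables = nn.ModuleList(
            nn.EmbeddingBag(num_embeddings, embedding_dim, mode="sum")
            for _ in range(num_tables)
        )
        self.mlp = nn.Sequential(
            nn.Linear(embedding_dim * num_tables, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, indices, offsets):
        # indices/offsets: one (indices, offsets) pair per embedding table
        pooled = [
            table(idx, off)
            for table, (idx, off) in zip(self.tables, zip(indices, offsets))
        ]
        return self.mlp(torch.cat(pooled, dim=1))


model = TinyDLRM().eval()

# Dummy sparse inputs: 4 lookups per sample, batch of 8
batch_size = 8
indices = [torch.randint(0, 1000, (batch_size * 4,)) for _ in model.tables]
offsets = [torch.arange(0, batch_size * 4, 4) for _ in model.tables]

# PyTorch 2.0 compilation: CPU flow with inductor as the backend
compiled = torch.compile(model, backend="inductor")
with torch.no_grad():
    out = compiled(indices, offsets)
print(out.shape)
```

With small tables like these, the compiled model runs and prints the output; it is only with the large 10GB/100GB embedding tables that the run appears to hang.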