Using FlexAttention with TorchRec

Hi Team

Do we have any examples of using FlexAttention with TorchRec? I tried it myself but got the following errors.

[rank0]: torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
[rank0]: IndexError: tuple index out of range

[rank0]: Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information

[rank0]: You can suppress this exception and fall back to eager by setting:
[rank0]: import torch._dynamo
[rank0]: torch._dynamo.config.suppress_errors = True
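For reference, here is a minimal sketch of the kind of call that hit this for me (the shapes and the causal score_mod are illustrative, not my exact model): it compiles flex_attention with torch.compile and runs it on dense query/key/value tensors. It assumes PyTorch >= 2.5 and a CUDA device.

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

def causal(score, b, h, q_idx, kv_idx):
    # score_mod that masks out future positions (causal attention).
    return torch.where(q_idx >= kv_idx, score, -float("inf"))

# flex_attention is meant to be wrapped in torch.compile for performance;
# this is also where the inductor backend error above was raised.
compiled_flex = torch.compile(flex_attention)

B, H, S, D = 2, 4, 256, 64  # batch, heads, sequence length, head dim
q, k, v = (torch.randn(B, H, S, D, device="cuda") for _ in range(3))

out = compiled_flex(q, k, v, score_mod=causal)
print(out.shape)  # torch.Size([2, 4, 256, 64])
```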

Thanks.

Update: I found that this is not related to TorchRec. Please ignore.