Limiting PyTorch Worker Thread Memory Usage

I’ve noticed that each of my PyTorch workers consumes up to 4% of system memory, and I have 10 workers running, so this adds up quickly and I’m concerned about excessive memory usage.
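
For reference, here is a minimal sketch of the kind of setup I mean. The dataset is just a placeholder (my real pipeline loads larger samples from disk), and the workers come from the DataLoader's `num_workers` argument:

```python
import torch
from torch.utils.data import DataLoader, Dataset

# Placeholder dataset; the real one loads larger samples from disk.
class DummyDataset(Dataset):
    def __len__(self):
        return 10_000

    def __getitem__(self, idx):
        return torch.randn(3, 224, 224), idx % 10

loader = DataLoader(
    DummyDataset(),
    batch_size=64,
    num_workers=10,  # each worker shows up at ~4% memory in `top`
)

for batch, labels in loader:
    pass  # training step goes here
```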

To be on the safe side, is there a way to cap the memory usage of these workers in PyTorch?