Usage of shared_cache in torch/multiprocessing

Hey everyone,

I have a question about the usage of the shared_cache storage in "torch/multiprocessing/", specifically in the routine "def rebuild_storage_fd(cls, df, size)".

If I disable the cache lookup "storage = storage_from_cache(cls, fd_id(fd))" in rebuild_storage_fd(cls, df, size), it ends up allocating a new storage via "cls._new_shared_fd(fd, size)" instead of reusing the one from the cache.
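To make the question concrete, here is a minimal, self-contained sketch of the lookup-then-allocate pattern described above. The names Storage, rebuild_storage, and the string keys are simplified stand-ins, not torch's actual implementation; the point is only that disabling the cache lookup makes every rebuild allocate a fresh object, while keeping it returns the already-mapped storage for a given fd identity:

```python
import weakref

# Hypothetical stand-in for torch's shared_cache: maps an fd identity
# to a weak reference, so a storage already rebuilt in this process
# is reused instead of being mapped a second time.
shared_cache = {}

class Storage:
    """Simplified stand-in for a shared-memory storage object."""
    def __init__(self, key, size):
        self.key = key
        self.size = size

def storage_from_cache(key):
    # Return the live storage for this key, or None on a miss
    # (either never cached, or already garbage-collected).
    ref = shared_cache.get(key)
    return ref() if ref is not None else None

def rebuild_storage(key, size):
    # Cache lookup first: if this fd was already rebuilt in this
    # process, hand back the existing storage object.
    storage = storage_from_cache(key)
    if storage is not None:
        return storage
    # Cache miss: allocate ("map") a new storage and remember it.
    storage = Storage(key, size)
    shared_cache[key] = weakref.ref(storage)
    return storage

a = rebuild_storage("fd:42", 1024)
b = rebuild_storage("fd:42", 1024)
assert a is b  # second rebuild hits the cache, no new allocation
```

With the lookup commented out, `a is b` would be False: both calls would build independent Storage objects for the same underlying fd, which is the behavior you are observing.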

Will disabling the shared_cache lookup have any other side effects on a training job and its outcome?
Are there any other uses of the shared_cache lookup in rebuild_storage_fd(cls, df, size) apart from the data loader scenario?