Getting flat curves when using torch.use_deterministic_algorithms

I’m working on the YOLOv5 repo and we’re trying to make training runs reproducible. Setting random seeds alone isn’t enough, so we’re experimenting with torch.use_deterministic_algorithms. With it, the experiments become reproducible in terms of losses, but all the validation metrics are flat at 0.
The PR with changes
Dashboard with losses - notice how all the losses decrease over time while the metrics/ section stays flat.
Here’s the init seed function:

import os
import random

import numpy as np
import torch
import torch.backends.cudnn as cudnn


def init_seeds(seed=0):
    # Initialize random number generator (RNG) seeds https://pytorch.org/docs/stable/notes/randomness.html
    # cudnn seed 0 settings are slower and more reproducible, else faster and less reproducible
    torch.use_deterministic_algorithms(True, warn_only=True)
    os.environ['PYTHONHASHSEED'] = str(seed)
    os.environ['CUBLAS_WORKSPACE_CONFIG'] = ':4096:8'  # needed for deterministic cuBLAS; must be set before CUDA init
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # https://pytorch.org/docs/stable/_modules/torch/cuda/random.html#manual_seed
    torch.cuda.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)  # for multi-GPU; safe to call even without a GPU
    cudnn.benchmark, cudnn.deterministic = (False, True) if seed == 0 else (True, False)
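As a quick sanity check (a minimal CPU-only sketch, not part of the PR), you can confirm that the deterministic flag is actually active and that re-seeding yields bit-identical results; the `seeded_forward` helper here is hypothetical:

```python
import torch

def seeded_forward(seed: int) -> torch.Tensor:
    # Re-seed, then run a small computation; with identical seeds,
    # repeated calls should match bit-for-bit.
    torch.manual_seed(seed)
    x = torch.randn(4, 8)
    w = torch.randn(8, 3)
    return x @ w

# warn_only=True warns instead of raising when an op has no deterministic implementation
torch.use_deterministic_algorithms(True, warn_only=True)
assert torch.are_deterministic_algorithms_enabled()

a = seeded_forward(0)
b = seeded_forward(0)
assert torch.equal(a, b)  # identical seeds -> identical outputs
```

If the losses reproduce but the metrics collapse, a check like this at least rules out the seeding itself; the 0 metrics then more likely come from something in the validation path rather than from nondeterminism.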