Why does Google Colab all of a sudden run out of memory?

Hello again. Sorry, I was just wondering: instead of using .detach() or .item(), can't I just use:

```
with torch.no_grad():
    x = layer(x)
    max_ = [img_.max() for channel in x for img_ in channel]
    min_ = [img_.min() for channel in x for img_ in channel]
```

instead of:

```
x = layer(x)
max_ = [img_.max().detach().numpy() for channel in x for img_ in channel]
min_ = [img_.min().detach().numpy() for channel in x for img_ in channel]
```
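
(For context, here is a self-contained version of the two variants I'm comparing -- the Conv2d layer and the random input below are just stand-ins I made up for this example, not my real model or data:)

```
import torch
import torch.nn as nn

layer = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # stand-in for my real layer
x = torch.rand(4, 3, 32, 32)                        # stand-in for a real batch

# Variant 1: wrap everything in no_grad, keep the per-image max/min as tensors
with torch.no_grad():
    out = layer(x)
    max_ = [img_.max() for channel in out for img_ in channel]
    min_ = [img_.min() for channel in out for img_ in channel]

# Variant 2: no no_grad, but detach every stored value and convert it to numpy
out = layer(x)
max_ = [img_.max().detach().numpy() for channel in out for img_ in channel]
min_ = [img_.min().detach().numpy() for channel in out for img_ in channel]
```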

Because, according to the documentation:
Context-manager that disables gradient calculation.
##############
Disabling gradient calculation is useful for inference, when you are sure that you will not call Tensor.backward(). It will reduce memory consumption for computations that would otherwise have requires_grad=True.
##############
I don't know whether it removes the entire computation graph or not (which was what caused the memory usage to increase in the first place).
I have tested this out and apparently it takes less memory, but I am just not sure.
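
Something like this should show whether a graph gets recorded at all (again, the Conv2d layer and random input are just made-up stand-ins):

```
import torch
import torch.nn as nn

layer = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # made-up placeholder layer
x = torch.rand(4, 3, 32, 32)                        # made-up placeholder input

# Outside no_grad: the output is attached to a graph through the layer's weights
out = layer(x)
print(out.requires_grad, out.grad_fn is None)   # True False

# Inside no_grad: no graph is recorded for the output
with torch.no_grad():
    out = layer(x)
print(out.requires_grad, out.grad_fn is None)   # False True
```

If I'm reading this right, no_grad never records the graph in the first place, while .detach() only stops the stored max/min values from keeping an already-built graph alive -- but I'd like to be sure.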