GPU memory leak on Google Colab

Hello, I use the following code to define my simple model:

import torch
import torch.nn as nn


def BlockConv(channels_in, channels_out):
    return nn.Sequential(
        nn.Conv2d(channels_in, channels_out, (3, 3), padding=1),
        nn.MaxPool2d(2),
        nn.ReLU()
    )
  
def BlockDeconv(channels_in, channels_out):
    return nn.Sequential(
        nn.Upsample(scale_factor=2),
        nn.Conv2d(channels_in, channels_out, (3, 3), padding=1),
        nn.ReLU()
    )
  
def BlockConvWithoutPool(channels_in, channels_out):
    return nn.Sequential(
        nn.Conv2d(channels_in, channels_out, (3, 3), padding=1)
    )

class Colorizer(nn.Module):
    def __init__(self):
        super().__init__()
        
        self.preconcat = nn.Sequential(
            BlockConv(1, 70),
            BlockDeconv(70, 64)
        )
        
        self.postconcat = nn.Sequential(
            BlockConvWithoutPool(65, 3),
            nn.Sigmoid()
        )
    
    def forward(self, x):
        h = self.preconcat(x)
        h = torch.cat((h, x), 1)  # 64 feature channels + 1 grayscale channel = 65
        h = self.postconcat(h)
        return h
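
For reference, a quick sanity check of the shapes (the batch size here is arbitrary):

model = Colorizer()
x = torch.rand(4, 1, 128, 128)  # batch of grayscale 128x128 images
out = model(x)
print(out.shape)  # torch.Size([4, 3, 128, 128])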

But it eats almost all of my GPU memory (about 80% of 8 GB) after the first training iteration on 128x128 images, right at loss.backward().
What have I done wrong?
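
To see the numbers, the peak allocation around the forward and backward pass can be measured like this (a rough sketch; the batch size, MSE loss, and random tensors are just placeholders for my real training loop):

model = Colorizer().cuda()
criterion = nn.MSELoss()                              # placeholder loss
x = torch.rand(32, 1, 128, 128, device='cuda')        # placeholder grayscale batch
target = torch.rand(32, 3, 128, 128, device='cuda')   # placeholder color targets

torch.cuda.reset_peak_memory_stats()
out = model(x)
print('peak after forward:', torch.cuda.max_memory_allocated() // 2**20, 'MiB')

loss = criterion(out, target)
loss.backward()
print('peak after backward:', torch.cuda.max_memory_allocated() // 2**20, 'MiB')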

You can check it on Google Colab.

Solved. Too many channels in the first conv layer: its 70-channel output is produced at the full 128x128 resolution (the MaxPool only comes afterwards), and all of those activations have to be kept for the backward pass, so memory grows with channels × height × width × batch size.
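
For anyone hitting the same issue, the fix is just to shrink the first block, along these lines (the 32 below is an arbitrary example; pick whatever fits your GPU):

class ColorizerSmall(nn.Module):
    def __init__(self):
        super().__init__()
        self.preconcat = nn.Sequential(
            BlockConv(1, 32),     # was BlockConv(1, 70)
            BlockDeconv(32, 64)
        )
        self.postconcat = nn.Sequential(
            BlockConvWithoutPool(65, 3),  # 64 feature channels + 1 grayscale channel
            nn.Sigmoid()
        )

    def forward(self, x):
        h = self.preconcat(x)
        h = torch.cat((h, x), 1)
        h = self.postconcat(h)
        return h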