GPU memory usage keeps increasing throughout training

I noticed that GPU memory usage constantly increases while training my model. The only thing I do differently, compared to the setup that was already training fine, is add a couple of different types of noise to my input batch:

    # randomly pick two of the four samples in the batch for Gaussian noise (noise2)
    idx_gaussian = np.random.choice(np.arange(4), 2, replace=False)

    # the remaining two samples get the masking noise (noise1)
    lst = np.array([0, 1, 2, 3])
    idx_masking = np.setdiff1d(lst, idx_gaussian)

    # take the raw tensor (without autograd history) and re-wrap it
    noisy_input = input_tensor.data
    noisy_input = Variable(noisy_input)

    # apply masking noise to two samples...
    noisy_input[idx_masking[0]] = self.noise1(noisy_input[idx_masking[0]])
    noisy_input[idx_masking[1]] = self.noise1(noisy_input[idx_masking[1]])

    # ...and Gaussian noise to the other two
    noisy_input[idx_gaussian[0]] = self.noise2(noisy_input[idx_gaussian[0]])
    noisy_input[idx_gaussian[1]] = self.noise2(noisy_input[idx_gaussian[1]])

and use the noisy_input tensor as my input. I added the .data part because I read that it would let me discard the redundant autograd information. It does not seem to help, though: I still end up with an out-of-memory error about 20 epochs in.
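
For reference, the detached version I was trying to get at would look roughly like this (just a sketch, assuming a PyTorch version where torch.no_grad() is available and that I do not need gradients to flow through the noise step):

    import torch

    # rough sketch: copy the batch outside of autograd so the noise ops
    # are not recorded in the graph, then feed the result to the model
    with torch.no_grad():
        noisy_input = input_tensor.detach().clone()
        noisy_input[idx_masking[0]] = self.noise1(noisy_input[idx_masking[0]])
        noisy_input[idx_masking[1]] = self.noise1(noisy_input[idx_masking[1]])
        noisy_input[idx_gaussian[0]] = self.noise2(noisy_input[idx_gaussian[0]])
        noisy_input[idx_gaussian[1]] = self.noise2(noisy_input[idx_gaussian[1]])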

If you can give a small snippet of your training loop, I can investigate further.
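
In the meantime, it may help to log the allocated memory at the end of every epoch to confirm where the growth happens (a minimal sketch using torch.cuda.memory_allocated()):

    import torch

    # print how much CUDA memory tensors currently occupy, plus the peak,
    # e.g. at the end of each epoch, to see whether allocations really grow
    print("allocated: %.1f MiB" % (torch.cuda.memory_allocated() / 1024**2))
    print("peak:      %.1f MiB" % (torch.cuda.max_memory_allocated() / 1024**2))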