Empty_cache does not work in 0.4.0

Hi. I have a problem with memory management. In 0.3.1, empty_cache worked very well for my code, but now it no longer works efficiently. My code outline is as follows:

    import torch as T

    for dataset in datasets:
        T.cuda.empty_cache()
        net = MyNet()
        for epoch in range(n_epochs):
            pass  # training goes here

Memory usage still increases slightly after each iteration. Since the number of datasets is large, I think the memory will eventually blow up. Please have a look at the problem.

Hi,

The purpose of empty_cache is not to prevent small increases in memory usage. It only reduces peak memory usage by releasing cached blocks that no live tensor is using (at the cost of runtime speed).
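To illustrate (a minimal sketch, assuming a CUDA device and the torch.cuda.memory_allocated / torch.cuda.memory_cached statistics available since 0.4.0): memory held by tensors that are still referenced is untouched by empty_cache; only blocks the caching allocator keeps around for reuse are returned to the driver.

    import torch

    x = torch.randn(1024, 1024, device='cuda')  # memory held by a live tensor
    y = torch.randn(1024, 1024, device='cuda')
    del y                                        # y's block goes back to the caching allocator

    print(torch.cuda.memory_allocated())         # bytes held by live tensors (x only)
    print(torch.cuda.memory_cached())            # bytes reserved by the caching allocator

    torch.cuda.empty_cache()                     # returns unused cached blocks to the driver
    print(torch.cuda.memory_allocated())         # unchanged: x is still alive
    print(torch.cuda.memory_cached())            # drops: the freed block left the cache

So if memory_allocated keeps growing across iterations, something in your loop is still holding references to tensors, and empty_cache cannot help with that.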

Could you give a small code example that reproduces the steady memory increase, please?
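Logging the allocator statistics inside your loop would also help narrow it down. Something like this rough sketch (datasets, MyNet, n_epochs and train_one_epoch are placeholders from your outline, not real APIs):

    import torch

    for i, dataset in enumerate(datasets):
        torch.cuda.empty_cache()
        net = MyNet().cuda()
        for epoch in range(n_epochs):
            train_one_epoch(net, dataset)    # hypothetical training step
        # memory actually held by tensors vs. memory kept in the cache
        print(i, torch.cuda.memory_allocated(), torch.cuda.memory_cached())

If memory_allocated grows from one dataset to the next, the extra memory is held by tensors that are still referenced (for example, losses accumulated without .item(), or the previous net/optimizer kept alive), not by the cache.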