F.leaky_relu blows up GPU

Hello. I was using leaky_relu and found that it causes a GPU memory leak: training runs for only a few iterations before hitting an out-of-memory error. When I switch to relu, training runs normally, using around 7 GB. I am wondering if this is a bug in leaky_relu?

PS: it seems that the memory leak happens only on Windows. My code with leaky ReLU works fine on Linux.

It could be a bug, but it’s strange that the memory leak only shows up on Windows. If you run leaky_relu in a for loop, does your GPU memory usage keep going up?
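Something along these lines should be enough to check (a minimal sketch, assuming the 0.4-style tensor API and a CUDA device; the tensor shape and iteration count are arbitrary placeholders):

```python
import torch
import torch.nn.functional as F

# Call leaky_relu repeatedly and watch the CUDA allocator counter.
# If leaky_relu itself leaks, the allocated number should keep climbing.
x = torch.randn(4096, 4096, device="cuda")

for i in range(200):
    y = F.leaky_relu(x, negative_slope=0.01)
    if i % 20 == 0:
        allocated = torch.cuda.memory_allocated() / 1024 ** 2
        print(f"iter {i}: {allocated:.1f} MiB allocated")
```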

Yes, I ran it in a normal for loop for training.

Are you using 0.3.1? Please update to 0.4.0; the memory leak has been fixed there.
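To confirm which build you are actually running (a quick sketch, nothing version-specific assumed beyond the standard version string):

```python
import torch

# Prints the installed PyTorch version, e.g. "0.3.1" or "0.4.0"
print(torch.__version__)
```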