Half Precision BackProp Taking up More RAM than Float

I have a model that takes approximately 8 GB of memory. When I convert the model to half precision via model.half(), the memory blows up to 48 GB during backward().

Has anyone encountered a similar problem?
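Roughly the pattern I'm following (a simplified sketch with a placeholder model and input; the real model is much larger and not shown here):

```python
import torch
import torch.nn as nn

# Placeholder model and data -- the real model is much larger and not shown here.
model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096)).cuda()
model.half()  # convert parameters and buffers to float16

x = torch.randn(64, 4096, device="cuda", dtype=torch.float16)
target = torch.randn(64, 4096, device="cuda", dtype=torch.float16)

out = model(x)
loss = nn.functional.mse_loss(out, target)
loss.backward()  # memory usage spikes here when the model is half
```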

Hi,

Are you running this on the CPU?
I don't think the CPU actually has a half-precision implementation, so it falls back to single-precision float. This might explain the memory increase :confused:
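A quick way to check which device and dtype the parameters actually end up with (a generic sketch using a placeholder module, not your model):

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 8)   # placeholder standing in for the real model
model.half()

param = next(model.parameters())
print(param.device, param.dtype)   # e.g. cpu torch.float16 or cuda:0 torch.float16
print(torch.cuda.is_available())   # False would mean everything runs on the CPU
```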

Hi,

It is running on a Tesla V100 GPU. It runs fine as float, fitting in just under 16 GB. But when I switch to half, it blows up during the backward pass; the memory goes up to 48 GB then.

The CPU memory or the GPU memory? A V100 doesn't have 48 GB of memory, right?
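You could also print the peak GPU allocation around the backward call with torch.cuda.max_memory_allocated() (a rough sketch with a placeholder model and input, not your actual code):

```python
import torch
import torch.nn as nn

# Placeholder model and input standing in for the real ones.
model = nn.Linear(1024, 1024).cuda().half()
x = torch.randn(32, 1024, device="cuda", dtype=torch.float16)

torch.cuda.reset_peak_memory_stats()
loss = model(x).float().mean()
loss.backward()
torch.cuda.synchronize()

print(f"peak GPU memory: {torch.cuda.max_memory_allocated() / 1024**3:.2f} GiB")
```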

GPU memory. I get an error message saying the GPU tried to allocate 48 GB, and then it crashes, as it only has 16 GB.

Could you post your model definition here so that we can reproduce this behavior, please?