Can the backward function decrease GPU memory usage?

def backward_theta(self, x, y):
    self.vgg16.train()
    loss, prob_trees = self.get_loss(x, y)
    self.optimizer.zero_grad()
    loss.backward()
    self.optimizer.step()

The code above runs normally, but the code below raises a "CUDA out of memory" error. Is something wrong somewhere else in the code?

def backward_theta(self, x, y):
    self.vgg16.train()
    loss, prob_trees = self.get_loss(x, y)
    self.optimizer.zero_grad()
    #loss.backward()
    self.optimizer.step()

Hi,

The backward pass frees all the buffers saved in the computational graph, so yes, it can reduce GPU memory usage.
If you want the same effect without calling backward, you need to make sure all references to the computational graph are gone, so most likely del loss and del prob_trees here.
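
For reference, here is a minimal sketch of that suggestion (assuming get_loss is the only place the graph gets built): deleting loss and prob_trees drops the last references to the graph, so its saved buffers can be released even though backward is never called.

def backward_theta(self, x, y):
    self.vgg16.train()
    loss, prob_trees = self.get_loss(x, y)
    self.optimizer.zero_grad()
    # No loss.backward() here, so autograd never frees the graph itself;
    # drop every reference to it so the saved buffers can be released.
    del loss
    del prob_trees
    self.optimizer.step()

Note that the freed memory usually stays in PyTorch's caching allocator for reuse, so tools like nvidia-smi will only show it as released after torch.cuda.empty_cache().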

Thx a lot!!!