PyTorch equivalent of TensorFlow GradientTape

Hi,

I was wondering what the PyTorch equivalent of the following TensorFlow code is:

def compute_apply_gradients(model, x, optimizer):
    with tf.GradientTape() as tape:
        loss = compute_loss(model, x)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))

many thanks,

chaslie

Hi,

You can look at the tutorials on the website. But that would be something like:

# forward pass; autograd records the graph implicitly, so there is no explicit tape
loss = compute_loss(model, x)
# clear gradients left over from the previous step
optimizer.zero_grad()
# backward pass: fills each parameter's .grad (the role of tape.gradient)
loss.backward()
# apply the update (the role of optimizer.apply_gradients)
optimizer.step()
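If you specifically want the gradients as explicit tensors, the way tape.gradient returns them, the closest analogue is torch.autograd.grad. Here is a minimal sketch, reusing the model, x, optimizer and compute_loss names from your snippet (assuming compute_loss returns a scalar tensor):

import torch

loss = compute_loss(model, x)
params = list(model.parameters())
# tuple of gradient tensors, one per parameter; the explicit
# counterpart of tape.gradient(loss, model.trainable_variables)
grads = torch.autograd.grad(loss, params)
# hand them to the optimizer, like optimizer.apply_gradients(zip(...))
for p, g in zip(params, grads):
    p.grad = g
optimizer.step()

For ordinary training loops the loss.backward() pattern above is all you need; this variant is only useful when you want to inspect or transform the gradients before applying them.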

hi albanD,

thanks for the reply. This is what I had assumed, but I wasn't sure if there were any switches I had missed.

Hope you manage to avoid COVID-19.

chaslie