Newbie question: cannot use `save_for_backward`

After reading the how-to about extending autograd, I’m trying a little experiment:

from torch.autograd import Function
import torch

from torch.autograd import Function
import torch

class MyCustomOp(Function):

    @staticmethod
    def forward(ctx, mat):
        indices = mat.sum(2)
        ctx.save_for_bacward(mat.shape, indices)
        # just return something and save stuff for later
        return indices

    @staticmethod
    def backward(ctx, grad_indices):
        mat_shape, indices = ctx.saved_tensors

        # just some random computation using the saved tensors
        grad_mat = indices[..., None].expand(mat_shape)
        return grad_mat

MyCustomOp.apply(torch.rand(10, 10, 10, 10))

However, I get: AttributeError: 'MyCustomOpBackward' object has no attribute 'save_for_bacward'

What’s wrong?

Torch version 1.4.0.

Hi,

You’re missing a k in save_for_backward :slight_smile:

Also, keep in mind that you should use save_for_backward() only for input or output Tensors. Intermediate Tensors, or inputs/outputs of other types, can simply be stored on the ctx, e.g. ctx.mat_shape = mat.shape in your case.
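For instance, here is a minimal sketch of the snippet above rewritten along those lines: the output Tensor goes through save_for_backward, while the (non-Tensor) shape is stashed directly on ctx.

```python
from torch.autograd import Function
import torch

class MyCustomOp(Function):

    @staticmethod
    def forward(ctx, mat):
        indices = mat.sum(2)
        # indices is an output Tensor, so save_for_backward is the right tool
        ctx.save_for_backward(indices)
        # mat.shape is a torch.Size, not a Tensor: stash it directly on ctx
        ctx.mat_shape = mat.shape
        return indices

    @staticmethod
    def backward(ctx, grad_indices):
        indices, = ctx.saved_tensors
        # same placeholder computation as in the original snippet
        grad_mat = indices[..., None].expand(ctx.mat_shape)
        return grad_mat

x = torch.rand(10, 10, 10, 10, requires_grad=True)
out = MyCustomOp.apply(x)
out.sum().backward()
```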

I’m not getting it, sorry…

Ah, good news, thanks for that!

In your code you use save_for_bacward; it should be save_for_backward.

Ah :cowboy_hat_face: