How to pass a "self" parameter?

Hi,
I got stuck at a point in my code (which might seem naive to others) where I would like to pass a self parameter. The sample model structure is the following:

import torch
import torch.nn as nn

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc = nn.Linear(768 * 1, 1)
        self.r = 1.0

    def forward(self, inp):
        fc_output = self.fc(inp) - self.r
        return fc_output, self.r

    def updateR(r):
        r = r + 1

def train():
    model.train()
    model.zero_grad()
    output_, r = model(inp)
    ## code will go here to update r. How could I update the r parameter from here??
    ## updateR(r)
    loss = someLossFunction
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
    optimizer.step()

The sample driver code is the following:

inp = someInput
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = A()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
epochs = 1
current = 1
while current <= epochs:
    train()
    current = current + 1

My objective is to find a way to pass the self parameter into train() so that I can call updateR() to update r. When I tried doing it in the following way, I got an error:

model.train()
and
def train(self):

Could you please advise me on how I could pass the self parameter to the train method?

The self argument refers to the module instance itself and is what makes a function a member method.
If you want to manipulate r in updateR, use:

def updateR(self):
    self.r = self.r + 1
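
Python binds self automatically when you call the method on an instance, so you never pass it explicitly. As a quick check (assuming the A class from your post with this corrected updateR):

model = A()
print(model.r)   # 1.0
model.updateR()  # Python supplies `self` (the model instance) automatically
print(model.r)   # 2.0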

Hi ptrblck,

Actually, I got stuck on how to call this updateR() from inside train(). Should I pass the self parameter to train()?

After you’ve defined updateR as a member function of the model:

class A(nn.Module):
    def __init__(self):
        super(A, self).__init__()
        self.fc = nn.Linear(768 * 1, 1)
        self.r = 1.0

    def forward(self, inp):
        fc_output = self.fc(inp) - self.r
        return fc_output, self.r

    def updateR(self):
        self.r = self.r + 1

you can call it via:

def train():
    model.train()
    model.zero_grad()
    output_, r = model(inp)
    model.updateR()
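
For completeness, here is a minimal end-to-end sketch of how train() could look with the update in place. It's just an illustration: the random input, zero target, and MSE loss are placeholders for your someInput and someLossFunction. Also note that r is a plain Python float rather than an nn.Parameter, so the optimizer never touches it; only updateR() changes it.

import torch
import torch.nn as nn

model = A()  # the A class defined above, with updateR(self)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

inp = torch.randn(4, 768)    # placeholder batch: 4 samples, 768 features each
target = torch.zeros(4, 1)   # placeholder target

def train():
    model.train()
    model.zero_grad()
    output_, r = model(inp)
    model.updateR()  # increments model.r; takes effect on the next forward pass
    loss = nn.functional.mse_loss(output_, target)  # stand-in for someLossFunction
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
    optimizer.step()

epochs = 1
current = 1
while current <= epochs:
    train()
    current = current + 1

print(model.r)  # 2.0 after one call to updateR()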