Question related to Tensor functions

Hey, I wanted to understand a little bit more about the following functions in the torch.Tensor class:

add_(value=1, other) → Tensor: If I call weights.add_(learning_rate, gradient), does it add (learning_rate * gradient) to the weights, or does it sum the learning rate and the gradient themselves? Also, does the change apply in place to the tensor that calls the function (in this case weights), or does it return a new tensor?

addcdiv(value=1, tensor1, tensor2) → Tensor: does it add value to the current tensor, and which of the two tensors is divided by which (tensor1 / tensor2, or the other way around)?


Found an answer:

torch.addcdiv(input, value=1, tensor1, tensor2, out=None) → Tensor

Performs the element-wise division of tensor1 by tensor2, multiplies the result by the scalar value, and adds it to input.
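A quick sketch of that behavior (the tensor values here are just made-up examples; note that in recent PyTorch versions `value` is a keyword-only argument placed after the two tensors):

```python
import torch

inp = torch.tensor([1.0, 2.0, 3.0])
t1 = torch.tensor([4.0, 10.0, 6.0])
t2 = torch.tensor([2.0, 5.0, 3.0])

# out = inp + value * (t1 / t2), element-wise
out = torch.addcdiv(inp, t1, t2, value=0.5)
print(out)  # t1 / t2 == [2., 2., 2.], so out == [2., 3., 4.]
```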

torch.add(input, other, alpha=1, out=None)

Each element of the tensor other is multiplied by the scalar alpha and added to the corresponding element of the tensor input:

out = input + alpha * other
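So in the weights example, the scalar multiplies the gradient before the addition, and the trailing underscore in add_ means the operation happens in place on the calling tensor (it also returns that same tensor, which is handy for chaining). A minimal sketch, using the current keyword-argument form `add_(other, alpha=...)` (older PyTorch versions also accepted the scalar positionally first, as in your snippet):

```python
import torch

weights = torch.tensor([1.0, 1.0])
gradient = torch.tensor([0.5, 0.25])
lr = 0.1

# In place: weights <- weights + (-lr) * gradient
result = weights.add_(gradient, alpha=-lr)

print(weights)            # tensor([0.9500, 0.9750])
print(result is weights)  # True: add_ returns the same tensor, not a copy
```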