Hey, I wanted to understand a little bit more about the following functions in the torch.Tensor class:
add_(value=1, other) → Tensor: if I call weights.add_(learning_rate, gradient), will it add learning_rate * gradient to weights, or will it sum the learning rate and the gradient separately? In addition, do the changes apply to the tensor that calls the function (in this case weights), or will it return a new tensor?
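To make my question concrete, here is a plain-Python sketch of what I *think* the docs mean (this is just my reading, not torch itself): the trailing underscore means in-place, and `value` scales `other` before the add:

```python
def add_(self_t, value, other):
    # My reading of the docs: in-place, elementwise self += value * other
    for i in range(len(self_t)):
        self_t[i] += value * other[i]
    return self_t  # trailing underscore => mutates self and returns it

weights = [1.0, 2.0, 3.0]
gradient = [0.5, 0.5, 0.5]
lr = -0.1  # negative value so this looks like a gradient-descent step

add_(weights, lr, gradient)
print(weights)  # roughly [0.95, 1.95, 2.95] if my reading is correct
```

Is that the right mental model, i.e. weights is mutated in place rather than a new tensor being allocated?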
addcdiv(value=1, tensor1, tensor2) → Tensor: it adds value times a quotient of the two tensors to the current tensor, but which of the two is the numerator and which is the divisor (tensor1 / tensor2, or tensor2 / tensor1)?
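Again as a plain-Python sketch of my assumption (tensor1 as the numerator), so someone can correct me if the order is the other way around:

```python
def addcdiv(self_t, value, tensor1, tensor2):
    # My assumption: returns self + value * (tensor1 / tensor2) elementwise,
    # i.e. tensor1 is the numerator and tensor2 the divisor
    return [s + value * (a / b) for s, a, b in zip(self_t, tensor1, tensor2)]

x = [1.0, 1.0]
print(addcdiv(x, 2.0, [4.0, 6.0], [2.0, 3.0]))  # [5.0, 5.0] under my assumption
```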