Does initializing loss with torch.zeros() break the graph?

Hi, suppose I initialize my loss with zeros and then fill the tensor elements by looping over predictions and targets. Will this break the graph, or will the gradients be computed correctly?
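For concreteness, here is a minimal sketch of the pattern I mean (the per-element loss here is just a placeholder squared error, and the shapes are made up):

```python
import torch

preds = torch.randn(4, requires_grad=True)
targets = torch.randn(4)

# Initialize a buffer with zeros, then fill it element by element.
per_sample = torch.zeros(4)
for i in range(4):
    # In-place index assignment of a tensor that requires grad.
    per_sample[i] = (preds[i] - targets[i]) ** 2

loss = per_sample.mean()
loss.backward()

print(per_sample.grad_fn is not None)  # is the buffer part of the graph?
print(preds.grad is not None)          # do gradients reach the leaf?
```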