# What does zero_grad() do in this case?

Let's say I have a network called net1.

The optimizer for that network is defined as follows:

```python
optimizer = torch.optim.Adam(net1.parameters())
```
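For context, here is a minimal runnable sketch of that setup (using a small `torch.nn.Linear` as a hypothetical stand-in for net1). It shows the documented behavior of `optimizer.zero_grad()`: it clears the `.grad` attribute of exactly the parameters that were passed to the optimizer's constructor, and touches nothing else.

```python
import torch

net1 = torch.nn.Linear(4, 2)  # hypothetical stand-in for net1
optimizer = torch.optim.Adam(net1.parameters())

# A forward/backward pass populates p.grad for net1's parameters.
out = net1(torch.randn(3, 4)).sum()
out.backward()
print(net1.weight.grad.abs().sum() > 0)  # gradients are now non-zero

# zero_grad() clears those gradients; set_to_none=False keeps them as
# zero tensors (instead of None) so we can inspect them afterwards.
optimizer.zero_grad(set_to_none=False)
print(net1.weight.grad.abs().sum() == 0)  # gradients are cleared
```

Without this call between iterations, `backward()` would accumulate new gradients on top of the old ones.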