I return all of the intermediate activations from my forward pass like this:
def forward(self, input):
    x = input
    allActivations = {}
    for i in range(self.n_layers):
        name = 'layer_{0:02d}'.format(i)
        x = getattr(self, name)(x)    # apply layer i
        allActivations[name] = x      # keep its output
    return x, allActivations
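For context, the layers are registered as attributes in __init__, roughly like this (MyNet, the sizes, and nn.Linear are simplified stand-ins for my real model):

import torch.nn as nn

class MyNet(nn.Module):
    def __init__(self, n_layers=3, width=16):
        super().__init__()
        self.n_layers = n_layers
        # register layer_00, layer_01, ... as submodules so forward's getattr finds them
        for i in range(n_layers):
            setattr(self, 'layer_{0:02d}'.format(i), nn.Linear(width, width))

    # forward(self, input) as shown above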
Now I want to scale them in place like this:
with torch.no_grad():
    for value in allActivations.values():
        value.div_(someValue)   # divide each stored activation in place
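In case it helps, here is a standalone snippet with the same pattern (the two nn.Linear layers, the shapes, and the divisor 2.0 are just stand-ins for my real setup):

import torch
import torch.nn as nn

# stand-in for my real model: two layers, activations collected as in forward above
layers = [nn.Linear(4, 4) for _ in range(2)]
x = torch.randn(3, 4)
allActivations = {}
for i, layer in enumerate(layers):
    x = layer(x)
    allActivations['layer_{0:02d}'.format(i)] = x

with torch.no_grad():
    for value in allActivations.values():
        value.div_(2.0)   # same in-place scaling as above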
Do I actually need the torch.no_grad() context here, or not? Thanks for your help!