Thank you for your reply.
When using the optimizer, like:
self.optimizer = torch.optim.SGD(self.model.parameters(), self.lr, momentum=0.9)
does it not include the parameters of self.threshold?
It will be included and you can check it via:
module = SpikingBasicBlock(1, 1, 1, 1)
print(dict(module.named_parameters()))
print(list(module.parameters()))
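To illustrate why the threshold shows up there: any tensor wrapped in nn.Parameter and assigned as an attribute in __init__ is registered automatically and returned by model.parameters(), so the optimizer will update it. Here is a minimal sketch (ThresholdModule is a made-up stand-in, since SpikingBasicBlock's real definition isn't shown):

```python
import torch
import torch.nn as nn

class ThresholdModule(nn.Module):
    # Hypothetical module: a linear layer plus a trainable threshold,
    # mimicking the self.threshold attribute from the question.
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)
        # Assigning an nn.Parameter registers it with the module.
        self.threshold = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        return self.linear(x) - self.threshold

module = ThresholdModule()
# 'threshold' appears alongside the linear layer's weight and bias,
# so SGD(module.parameters(), lr=0.1) would update it as well.
print(dict(module.named_parameters()).keys())
```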
Thank you so much for your time and your support!!
Hi ptrblck,
I also want to define a new tensor inside an nn.Module and I have a follow-up question:
I want to define an nn.Parameter with requires_grad=False in the constructor. Can the shape and content of the defined nn.Parameter be changed later, or do I need to know the content and shape when constructing my nn.Module?
Thank you very much in advance!
If you create this parameter (or buffer, as it doesn't require gradients) in the __init__
method of the module, you would need to know the shape in advance.
However, since this tensor doesn't require gradients and is thus not trainable, you could also lazily define this tensor, e.g. in the forward
method, if you have the shape information by then.
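A minimal sketch of the lazy-buffer approach described above (running_stat is a hypothetical name; register_buffer accepts None as a placeholder, and the tensor is created on the first forward pass once the input shape is known):

```python
import torch
import torch.nn as nn

class LazyBufferModule(nn.Module):
    def __init__(self):
        super().__init__()
        # Register a placeholder buffer; buffers are part of the module's
        # state (saved in state_dict) but receive no gradients.
        self.register_buffer("running_stat", None)

    def forward(self, x):
        # Lazily create the buffer once the input shape is known.
        if self.running_stat is None:
            self.running_stat = torch.zeros(x.shape[1])
        self.running_stat += x.mean(dim=0)
        return x

module = LazyBufferModule()
out = module(torch.randn(8, 3))
print(module.running_stat.shape)  # torch.Size([3])
```

Assigning a tensor to an attribute that was registered via register_buffer updates the buffer in place, so the lazily created tensor still shows up in the module's state_dict.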