Hi!
So I have a model in pytorch that looks like this:
class LAS(nn.Module):
    def __init__(self, listener, speller):
        super(LAS, self).__init__()
        self.listener = listener
        self.speller = speller

    def forward(self, input):
        input = self.listener(input)
        input = self.speller(input)
        return input
Where listener and speller are two instances of classes that also inherit from nn.Module:

class Speller(nn.Module):  # each class has its own layers and neurons, omitted here for clarity
class Listener(nn.Module):
I know I can do LAS.speller.parameters() to access the speller parameters, but I would like to combine both so I can pass them to my optimizer like this:

optimizer = torch.optim.Adam(
    LAS.parameters(),
    lr=params["training"]["lr"],
)
Instead of doing it like this:

optimizer = torch.optim.Adam(
    [{"params": LAS.listener.parameters()}, {"params": LAS.speller.parameters()}],
    lr=params["training"]["lr"],
)
When I call LAS.parameters(), will that be equivalent to passing both the listener and speller parameters to the optimizer separately?
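For reference, here is a minimal self-contained sketch I used to check this empirically. The Listener and Speller bodies are placeholder single-layer modules (the real ones are omitted above), but the parameter-collection behavior is the same: nn.Module.parameters() recurses into submodules assigned as attributes.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real Listener/Speller, just to make
# the parameter check runnable; the actual layers are omitted above.
class Listener(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)

    def forward(self, x):
        return self.fc(x)

class Speller(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 4)

    def forward(self, x):
        return self.fc(x)

class LAS(nn.Module):
    def __init__(self, listener, speller):
        super().__init__()
        self.listener = listener
        self.speller = speller

    def forward(self, x):
        return self.speller(self.listener(x))

model = LAS(Listener(), Speller())

# Compare the identities of the tensors yielded by the combined call
# versus the two per-submodule calls.
combined = {id(p) for p in model.parameters()}
separate = ({id(p) for p in model.listener.parameters()}
            | {id(p) for p in model.speller.parameters()})
print(combined == separate)
```

If the sets match, the single-call form and the two-group form hand the same tensors to the optimizer (the per-group form only matters when you want different hyperparameters per group).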