Hi all,
I’ve searched the forum for my question but couldn’t find anything similar. Here’s my problem.
I have an architecture with 4 different nn.Sequential blocks (shared below). I’m trying to investigate the impact of latent_size with the values 4, 8, and 16. As you can see, “self.lf_model” does not depend on the input argument latent_size. However, the initial weights of “self.lf_model” change when latent_size changes, even though I seed with torch.manual_seed(15451), random.seed(15451), and np.random.seed(15451).
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self, latent_size):
        super(Model, self).__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_features=4096, out_features=1024),
            nn.Linear(in_features=1024, out_features=256),
            nn.Linear(in_features=256, out_features=64),
            nn.Linear(in_features=64, out_features=latent_size)
        )
        self.lf_model = nn.Sequential(
            nn.Linear(in_features=4096, out_features=512), nn.ReLU(),
            nn.Linear(in_features=512, out_features=128), nn.ReLU(),
            nn.Linear(in_features=128, out_features=16), nn.ReLU(),
            nn.Linear(in_features=16, out_features=2)
        )
        self.lc_model = nn.Sequential(
            nn.Linear(in_features=latent_size + 2, out_features=32),
            nn.Linear(in_features=32, out_features=2)
        )
        self.nlc_model = nn.Sequential(
            nn.Linear(in_features=latent_size + 2, out_features=32), nn.ReLU(),
            nn.Linear(in_features=32, out_features=32), nn.ReLU(),
            nn.Linear(in_features=32, out_features=2)
        )
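For clarity, this is roughly how I compare the initial weights (the build_model helper is just for this example; the seed is the one mentioned above). Even though I re-seed right before constructing each model, the lf_model weights come out different:

import random
import numpy as np
import torch

def build_model(latent_size, seed=15451):
    # Re-seed everything so each construction starts from the same RNG state
    torch.manual_seed(seed)
    random.seed(seed)
    np.random.seed(seed)
    return Model(latent_size)

# Compare the first-layer weights of lf_model for two latent sizes
m4 = build_model(4)
m8 = build_model(8)
print(torch.equal(m4.lf_model[0].weight, m8.lf_model[0].weight))  # prints False for me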
Why do I get different initial weights for “self.lf_model”? How can I make them initialize identically regardless of latent_size?
Thank you in advance!