I understand that self.fc1(), self.fc2(), … refer to the first and second fully connected layers, which are both Linear in my example, but what is self.fc_mu referring to, and what does it return? Also, what are the parameters of self.fc_mu() and self.fc_sigma(), which I saw used many times? Thank you in advance; I am just struggling to figure this out.

self.fc_mu and self.fc_sigma are just the attribute names of two more linear layers.
Their meaning depends on the context. In this case they are most likely used to apply the “reparametrization trick”: fc_mu predicts the mean and fc_sigma the standard deviation of a latent distribution, which is then sampled in a differentiable way.
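A minimal sketch of the trick in isolation (the tensor shapes here are made up for illustration): sampling z ~ N(mu, sigma²) directly is not differentiable, but writing the sample as mu + sigma * eps with eps ~ N(0, 1) lets gradients flow back into mu and sigma.

```python
import torch

# Pretend these came out of fc_mu and fc_sigma for a batch of one.
mu = torch.zeros(4, requires_grad=True)
sigma = torch.ones(4, requires_grad=True)

eps = torch.randn_like(sigma)   # eps ~ N(0, 1); the randomness lives here, not in mu/sigma
z = mu + sigma * eps            # z ~ N(mu, sigma^2), but built from differentiable ops

z.sum().backward()
print(mu.grad)      # dz/dmu = 1 for every element
print(sigma.grad)   # dz/dsigma = eps, the noise that was drawn
```

Because eps carries all the randomness, mu and sigma appear only in ordinary arithmetic, so autograd can backpropagate through the sampling step.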

for m in self.modules():
    if isinstance(m, nn.Linear):
        # Kaiming-initialize every linear layer's weights and zero its biases
        nn.init.kaiming_normal_(m.weight)
        nn.init.constant_(m.bias, 0.0)

def forward(self, x):
    x = F.leaky_relu(self.fc1(x), negative_slope=2e-1)
    x = F.leaky_relu(self.fc2(x), negative_slope=2e-1)
    mu = self.fc_mu(x)                       # mean of the latent distribution
    sigma = torch.sigmoid(self.fc_sigma(x))  # std, squashed into (0, 1)
    # reparametrization: sample mu + sigma * eps with eps ~ N(0, 1)
    x = torch.sigmoid(F.leaky_relu(self.fc3(mu + sigma * torch.randn_like(sigma)), negative_slope=2e-1))
    return x, mu, sigma

What would this entail? And what does the reparametrization trick involve?