That makes sense, thank you Frank! I was originally trying to create both the ReLU and Linear layers while initializing the ModuleList, but I found this approach more helpful, in case anyone is curious:
self.fc2 = nn.ModuleList()
for _ in range(initHiddenWeights.shape[0]):
    self.fc2.append(nn.Linear(initHiddenWeights.shape[1], initHiddenWeights.shape[2]))
    self.fc2.append(nn.ReLU())
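For anyone who wants to see it in context, here is a minimal sketch of how this could fit into a full module, including the forward pass that iterates over the ModuleList (a plain nn.ModuleList has no forward of its own). The Net class name, the assumption that initHiddenWeights is a 3-D tensor shaped (num_hidden_layers, in_features, out_features), and the dummy tensor at the end are all just illustrative, not part of the original code:

import torch
import torch.nn as nn

class Net(nn.Module):  # hypothetical wrapper class, for illustration only
    def __init__(self, initHiddenWeights):
        super().__init__()
        # Assumed shape: (num_hidden_layers, in_features, out_features)
        self.fc2 = nn.ModuleList()
        for _ in range(initHiddenWeights.shape[0]):
            self.fc2.append(nn.Linear(initHiddenWeights.shape[1], initHiddenWeights.shape[2]))
            self.fc2.append(nn.ReLU())

    def forward(self, x):
        # ModuleList is just a container, so apply each Linear/ReLU in order
        for layer in self.fc2:
            x = layer(x)
        return x

# quick sanity check with dummy weights: 3 hidden layers of size 8 -> 8
w = torch.zeros(3, 8, 8)
net = Net(w)
out = net(torch.randn(4, 8))
print(out.shape)  # torch.Size([4, 8])

Note that stacking the layers this way only works if in_features equals out_features for the hidden layers, since each Linear feeds directly into the next.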