In the BicycleGAN paper ( https://arxiv.org/pdf/1711.11586.pdf ), the accompanying PyTorch implementation contains the following code:
        layers += [nl_layer()]
        layers += [convMeanpool(inplanes, outplanes)]
        self.conv = nn.Sequential(*layers)
        self.shortcut = meanpoolConv(inplanes, outplanes)

    def forward(self, x):
        out = self.conv(x) + self.shortcut(x)
        return out
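As a quick illustration of what this residual block does to tensor shapes, here is a minimal self-contained sketch (not the authors' exact code; `nl_layer`, `convMeanpool`, and `meanpoolConv` are replaced by plain `nn` modules with the same structure): the main path applies a nonlinearity, a conv, and an average pool, while the shortcut average-pools and then projects channels, and the two are summed.

```python
import torch
import torch.nn as nn

class BasicBlockSketch(nn.Module):
    """Hedged stand-in for the repo's residual downsampling block."""

    def __init__(self, inplanes, outplanes):
        super().__init__()
        self.conv = nn.Sequential(
            nn.ReLU(),                                    # stands in for nl_layer()
            nn.Conv2d(inplanes, outplanes, 3, padding=1),
            nn.AvgPool2d(2),                              # convMeanpool's pooling half
        )
        self.shortcut = nn.Sequential(
            nn.AvgPool2d(2),
            nn.Conv2d(inplanes, outplanes, 1),            # meanpoolConv's projection
        )

    def forward(self, x):
        # Both branches halve the spatial size and map to outplanes channels,
        # so the elementwise sum is well defined.
        return self.conv(x) + self.shortcut(x)

x = torch.randn(1, 64, 32, 32)
y = BasicBlockSketch(64, 128)(x)
print(y.shape)  # torch.Size([1, 128, 16, 16])
```

Note that the block halves height and width and changes the channel count; it does not by itself collapse the feature map to a 1x1 vector.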
    def __init__(self, input_nc=3, output_nc=1, ndf=64, n_blocks=4,
                 norm_layer=None, nl_layer=None, vaeLike=False):
        self.vaeLike = vaeLike
        max_ndf = 4
        conv_layers = [
            nn.Conv2d(input_nc, ndf, kernel_size=4, stride=2, padding=1, bias=True)]
        for n in range(1, n_blocks):
            input_ndf = ndf * min(max_ndf, n)
            output_ndf = ndf * min(max_ndf, n + 1)
Could anyone confirm whether the forward function returns a 1x1-dimensional vector representing the point estimate of the predicted mean? If so, in the following function, is this mean value, i.e. mu, broadcast to a |z|-dimensional vector via the reparameterization trick?
        self.real_B = input['B' if AtoB else 'A'].to(self.device)
        self.image_paths = input['A_paths' if AtoB else 'B_paths']
    def get_z_random(self, batch_size, nz, random_type='gauss'):
        if random_type == 'uni':
            z = torch.rand(batch_size, nz) * 2.0 - 1.0
        elif random_type == 'gauss':
            z = torch.randn(batch_size, nz)
        return z
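To see that both branches of `get_z_random` produce a full `(batch_size, nz)` tensor (uniform in [-1, 1) or standard normal) rather than a scalar, here is a standalone version of the same logic, stripped of `self` for illustration:

```python
import torch

def get_z_random(batch_size, nz, random_type='gauss'):
    """Standalone copy of the repo's latent sampler, for shape inspection."""
    if random_type == 'uni':
        # Uniform in [-1, 1): rand() gives [0, 1), scaled and shifted.
        return torch.rand(batch_size, nz) * 2.0 - 1.0
    # Standard normal, same (batch_size, nz) shape.
    return torch.randn(batch_size, nz)

z_uni = get_z_random(4, 8, 'uni')
z_gauss = get_z_random(4, 8, 'gauss')
print(z_uni.shape, z_gauss.shape)  # torch.Size([4, 8]) torch.Size([4, 8])
```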
    def encode(self, input_image):
        mu, logvar = self.netE.forward(input_image)
        std = logvar.mul(0.5).exp_()
        eps = self.get_z_random(std.size(0), std.size(1))
        z = eps.mul(std).add_(mu)
        return z, mu, logvar
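Regarding the broadcasting question: in this `encode` method, `mu` and `logvar` already come out of the encoder with shape `(batch, nz)`, so no broadcasting from a 1x1 scalar is involved; `z = eps * std + mu` is a purely elementwise reparameterization. A small self-contained demonstration of that arithmetic (with dummy `mu`/`logvar` values, not real encoder outputs):

```python
import torch

batch, nz = 2, 8
mu = torch.zeros(batch, nz)      # dummy mean, shape (batch, nz)
logvar = torch.zeros(batch, nz)  # log-variance 0  ->  std = 1

std = logvar.mul(0.5).exp()      # std = exp(0.5 * logvar), elementwise
eps = torch.randn(batch, nz)     # eps ~ N(0, I), same shape as std
z = eps.mul(std).add(mu)         # z = eps * std + mu, still (batch, nz)

print(z.shape)               # torch.Size([2, 8])
print(torch.equal(z, eps))   # True here, since std = 1 and mu = 0
```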
    def test(self, z0=None, encode=False):
        if encode:  # use encoded z
            z0, _ = self.netE(self.real_B)
You could add print statements to the methods in question and print the shapes of the tensors as well as the parameters. Let me know if I misunderstood the question.
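If editing the model source is inconvenient, forward hooks can print shapes without touching the methods themselves. A sketch on a toy network (the layers here are illustrative, not the BicycleGAN encoder):

```python
import torch
import torch.nn as nn

# Toy stand-in for an encoder front end.
model = nn.Sequential(
    nn.Conv2d(3, 64, 4, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(64, 128, 4, stride=2, padding=1),
)

def report(module, inputs, output):
    # Called after each module's forward; prints the output shape.
    print(type(module).__name__, tuple(output.shape))

for m in model:
    m.register_forward_hook(report)

out = model(torch.randn(1, 3, 64, 64))
# Conv2d (1, 64, 32, 32)
# ReLU (1, 64, 32, 32)
# Conv2d (1, 128, 16, 16)
```

The same hook can be registered on `netE`'s submodules to confirm whether its output is 1x1 spatially or a flat `(batch, nz)` vector.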
Yes, I was thinking of cloning the repo, but I am afraid my laptop might not have the required specs or might hang during training. I am eagerly waiting for GitHub Codespaces for that. I shall try to reduce the training-set size. Could you figure out from the snippet posted above the shape of the