GAN issue - ValueError: only one element tensors can be converted to Python scalars

I have the following Discriminator class for my GAN model:

import numpy as np
import torch.nn as nn


class Discriminator(nn.Module):
    def __init__(self, image_size, conv_dim, output_dim, repeat_num):
        super(Discriminator, self).__init__()
        layers = []
        layers.append(nn.Conv2d(3, conv_dim, kernel_size=4, stride=2, padding=1))
        layers.append(nn.LeakyReLU(0.01))

        curr_dim = conv_dim

        for i in range(1, repeat_num):
            layers.append(nn.Conv2d(curr_dim, curr_dim * 2, kernel_size=4, stride=2, padding=1))
            layers.append(nn.LeakyReLU(0.01))
            curr_dim = curr_dim * 2

        kernel_size = int(image_size / np.power(2, repeat_num))

        self.main  = nn.Sequential(*layers)
        self.conv1 = nn.Conv2d(curr_dim, 1, kernel_size=3, stride=1, padding=1, bias=False)
        self.conv2 = nn.Conv2d(curr_dim, output_dim, kernel_size=kernel_size, bias=False)
        self.sig   = nn.Sigmoid()
        self.soft  = nn.Softmax(dim=1)

    def dLoss(self, images, batch_size, alpha=1.0):
        assert 0 <= alpha <= 1

        loss = 0
        for i, img in enumerate(images):
            C, H, W = img.size()[:3]
            x = img.view(1, C, H, W)
            D_x, D_y = Discriminator(x)
            ...
        return loss

    def forward(self, x):
        h  = self.main(x)
        s  = self.conv1(h)
        so = self.conv2(h)
        out_s  = self.sig(s)
        out_so = self.soft(so)
        return out_s.flatten(start_dim=2).mean(dim=2), out_so.view(out_so.size(0), out_so.size(1))

For the line D_x, D_y = Discriminator(x) I am getting the following error:

D_x, D_y = Discriminator(x)
 File "/home/project/network.py", line 162, in __init__
kernel_size = int(image_size / np.power(2, repeat_num))
ValueError: only one element tensors can be converted to Python scalars

But I get this error only when I call Discriminator(x) from a method inside the class itself. If I call it from a method of a different class, it doesn't raise any error.
I am confused about what exactly is causing this issue. Please help.

Hi shrbrh!

This line is your problem:

D_x, D_y = Discriminator(x)

It instantiates (creates a new instance of) Discriminator, but with invalid initialization arguments: the image tensor x is passed in as image_size, so int(image_size / np.power(2, repeat_num)) in __init__ tries to convert a multi-element tensor to a Python scalar, which is exactly the ValueError you see.
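You can reproduce the failure in isolation. This is just an illustrative sketch (the 64x64 shape and the power of 2 are made-up values), but it shows what happens when a whole image tensor lands in image_size:

import numpy as np
import torch

x = torch.randn(1, 3, 64, 64)           # the tensor that ends up in image_size
kernel_size = int(x / np.power(2, 4))   # ValueError: only one element tensors can be converted to Python scalars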

You don’t want to define dLoss() inside of (as a method of) Discriminator.
(Note, dLoss() never uses the self argument you pass it.)

Outside of the definition of your Discriminator class you should do something
like:

# instantiate Discriminator
my_discriminator = Discriminator(various_initialization_parameters)
...
x = img.view(1, C, H, W)
D_x, D_y = my_discriminator(x)   # use the instance
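As an untested sketch of how dLoss() could move out of the class (keeping your loop and leaving the elided loss terms as they are; d_loss is just an illustrative name):

def d_loss(discriminator, images, batch_size, alpha=1.0):
    assert 0 <= alpha <= 1

    loss = 0
    for i, img in enumerate(images):
        C, H, W = img.size()[:3]
        x = img.view(1, C, H, W)
        D_x, D_y = discriminator(x)   # call the instance, not the class
        ...                           # accumulate the loss terms as before
    return loss

loss = d_loss(my_discriminator, images, batch_size)

Calling the instance goes through nn.Module.__call__, which runs your forward() and returns the (out_s, out_so) pair.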

Best.

K. Frank

Thank you for clarifying this.