Discriminator Accuracy

Hi,

I am using a GAN model, and I want to calculate the discriminator’s accuracy on both real and fake samples. Could someone help me, please?

Thank you,

You could calculate the accuracy as with any other classification model. Have a look at e.g. the ImageNet example.
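A minimal sketch, assuming your discriminator (called netD here as a placeholder name) returns one raw logit per sample and you treat real samples as class 1 and fake samples as class 0, could look like this:

import torch

# netD, real_imgs, and fake_imgs are placeholders for your discriminator and data batches
with torch.no_grad():
    real_logits = netD(real_imgs).view(-1)
    fake_logits = netD(fake_imgs).view(-1)

    # threshold the sigmoid probability at 0.5 to get hard 0/1 predictions
    real_preds = (torch.sigmoid(real_logits) > 0.5).float()
    fake_preds = (torch.sigmoid(fake_logits) > 0.5).float()

    # real samples should be predicted as 1, fake samples as 0
    real_acc = real_preds.eq(1.).float().mean() * 100
    fake_acc = fake_preds.eq(0.).float().mean() * 100

If your discriminator already applies a sigmoid in its last layer, you would skip the torch.sigmoid call and threshold its output directly.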

What if I am using unsupervised learning? How can I calculate the discriminator accuracy then?

I don’t know how the accuracy calculation would work if no labels are provided.

What about the function below, is it correct?

d_acc = 0
d_acc += (fake_imgs == real_imgs).float().sum()
accuracy = 100 * d_acc / len(train_dataloader)

Is there another way to evaluate the discriminator in unsupervised learning?

I don’t think comparing image data directly would work well for a lot of use cases.
Especially if you are using floating point values, you would certainly need to use a small eps value in the comparison. Also, you would have to think about what the “accuracy” really represents.
E.g. assuming you are working on a use case where a fake image should be “close” to the input image: would it matter if the generated fake image were shifted by a single pixel (potentially giving a low “accuracy”), or would the overall “quality” of the image matter more?
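If you still want to try a direct pixel comparison, a minimal sketch (assuming real_imgs and fake_imgs are float tensors of the same shape; the tolerance atol is an arbitrary value you would have to tune to your data range) could be:

import torch

# fraction of pixels where the fake image is "close enough" to the real one
atol = 1e-2  # arbitrary tolerance, not a recommended value
pixel_match = torch.isclose(fake_imgs, real_imgs, atol=atol)
pixel_accuracy = pixel_match.float().mean() * 100

Keep the previous caveat in mind: such a per-pixel “accuracy” can be low for an image that is only shifted by one pixel, so it might not reflect the visual quality at all.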

How can I calculate the accuracy based on the predicted scores of the discriminator?