I am trying to compute the Inception score for my GAN, but I have observed something interesting. Why are we using only X in the calculation? I thought the KL divergence was supposed to take two distributions: the true distribution P(x) and the predicted model distribution Q(x).
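For reference, the definition I have in mind is KL(P || Q) = sum_x P(x) * (log P(x) - log Q(x)), which needs both P(x) and Q(x).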
import numpy as np

def inception_score(X, eps=1e-8):  # X: 2-D tensor; the eps value is assumed here
    # KL between each row of X and the column-wise mean X.mean(0)
    kl = X * ((X + eps).log() - (X.mean(0) + eps).log().expand_as(X))
    score = np.exp(kl.sum(1).mean().item())
    return score
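For context, here is how I run it; the random softmax input is just a placeholder I made up so the function executes:

import torch
# fake batch of 64 predictions over 10 classes, each row sums to 1
X = torch.softmax(torch.randn(64, 10), dim=1)
print(inception_score(X))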
But it looks like the KL divergence here is between X and its mean X.mean(0), i.e. if X is the generated image, then what's actually going on?
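If I rewrite it with an explicit two-argument KL, this is what I think the code is computing, with X.mean(0) playing the role of Q(x) (just my reading of it, so correct me if I am wrong):

def kl_div(p, q, eps=1e-8):
    # row-wise KL(P || Q) = sum_x P(x) * (log P(x) - log Q(x))
    return (p * ((p + eps).log() - (q + eps).log())).sum(1)

# this should match the per-sample kl term inside inception_score:
# kl_div(X, X.mean(0).expand_as(X))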