What is the "hello world" of GANs?


I’ve been wanting to get to know GANs better, and I was wondering what a good entry-point exercise could be. What do you think of a setting where the adversarial network learns to generate sine waves?

You’d have various sine waves with different frequencies and different phase offsets, each described by, say, 256 points (neurons), and the adversarial network should learn the distribution of sines. Does that sound logical, coherent, relevant?
Thanks!
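As a rough sketch of that setup (names, network sizes, and hyperparameters are my own assumptions, not a recommendation): sample 256-point sine curves with random frequency and phase as the "real" data, then train a small fully connected generator and discriminator with the standard GAN objective.

```python
import torch
import torch.nn as nn

N_POINTS = 256  # points per curve, as in the proposed setup

def sample_real(batch_size):
    """Draw a batch of sine waves with random frequency and phase."""
    t = torch.linspace(0, 1, N_POINTS)
    freq = torch.rand(batch_size, 1) * 4 + 1          # assumed range: 1–5 cycles
    phase = torch.rand(batch_size, 1) * 2 * torch.pi
    return torch.sin(2 * torch.pi * freq * t + phase)

# Generator: latent vector -> 256-point curve (Tanh keeps output in [-1, 1])
G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, N_POINTS), nn.Tanh())
# Discriminator: curve -> probability it is real
D = nn.Sequential(nn.Linear(N_POINTS, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(200):  # toy run; a real experiment needs many more steps
    real = sample_real(32)
    fake = G(torch.randn(32, 16))

    # Discriminator step: push real -> 1, fake -> 0
    opt_d.zero_grad()
    loss_d = (bce(D(real), torch.ones(32, 1))
              + bce(D(fake.detach()), torch.zeros(32, 1)))
    loss_d.backward()
    opt_d.step()

    # Generator step: try to fool the discriminator (fake -> 1)
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(32, 1))
    loss_g.backward()
    opt_g.step()
```

Plotting a few `G(torch.randn(1, 16))` samples every so often makes it easy to see whether the generator is starting to produce sine-like curves or has mode-collapsed onto one shape.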

https://arxiv.org/abs/1701.00160 — here’s something to start with


Thanks, I’ll read that! Do you have any experience with them?

Just a bit. I’m interested in them and have played with them a little, but I haven’t really had time to master the subject.

Here is a very simple toy example for a mixture density network and SLOGAN.

There are also 8 Gaussians and friends, in PyTorch e.g. in [this improved WGAN notebook](https://github.com/t-vi/pytorch-tvmisc/blob/master/wasserstein-distance/Improved_Training_of_Wasserstein_GAN.ipynb).
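For reference, the "8 Gaussians" target mentioned above is a classic 2-D GAN toy distribution: eight narrow Gaussian modes arranged on a ring. A minimal sampler (radius and spread are the usual conventions, but exact values vary between implementations) could look like this:

```python
import math
import random

def sample_8gaussians(n, radius=2.0, std=0.02):
    """Sample n points from 8 Gaussian modes evenly spaced on a circle."""
    points = []
    for _ in range(n):
        k = random.randrange(8)                      # pick one of the 8 modes
        cx = radius * math.cos(2 * math.pi * k / 8)  # mode center x
        cy = radius * math.sin(2 * math.pi * k / 8)  # mode center y
        points.append((random.gauss(cx, std), random.gauss(cy, std)))
    return points
```

It’s popular precisely because mode collapse is immediately visible: a collapsed generator covers only a few of the eight clusters.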

It is a neat topic; you’ll enjoy exploring it.

Best regards