The Math Behind GANs (2020)


Generative Adversarial Networks refer to a family of generative models that seek to discover the underlying distribution behind a certain data generating process. This distribution is discovered through an adversarial competition between a generator and a discriminator. As we saw in an earlier introductory post on GANs, the two models are trained such that the discriminator strives to distinguish between generated and true examples, while the generator seeks to confuse the discriminator by producing data that are as realistic and compelling as possible.

\[\min_G \max_D \{ \log(D(x)) + \log(1-D(G(z))) \} \tag{9}\]

The min-max formulation is a concise one-liner that intuitively demonstrates the adversarial nature of the competition between the generator and the discriminator. For a real sample $x$ drawn from the data, we expect the optimal discriminator to assign a label close to one, since $p_{data}(x)$ is high. On the other hand, for a generated sample $x = G(z)$, we expect the optimal discriminator to assign a label of zero, since $p_{data}(G(z))$ should be close to zero. This conclusion certainly aligns with our intuition: we want the generator to be able to learn the underlying distribution of the data from sampled training examples.
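To make this intuition concrete, here is a minimal Python sketch (not code from the original post; the function names are illustrative) that evaluates the inner value of (9) at the closed-form optimal discriminator $D^*(x) = p_{data}(x) / (p_{data}(x) + p_g(x))$ over a small discrete support:

```python
import math

def optimal_discriminator(pd, pg):
    """D*(x) = p_data(x) / (p_data(x) + p_g(x)),
    obtained by maximizing pd*log(D) + pg*log(1 - D) pointwise in D."""
    return pd / (pd + pg)

def value_at_optimal_d(p_data, p_g):
    """Value of the objective at D* over a discrete support:
    sum_x [ p_data(x) log D*(x) + p_g(x) log(1 - D*(x)) ]."""
    v = 0.0
    for pd, pg in zip(p_data, p_g):
        d = optimal_discriminator(pd, pg)
        v += pd * math.log(d) + pg * math.log(1.0 - d)
    return v

# When the generator's distribution matches the data exactly,
# D* = 1/2 everywhere and the value bottoms out at -2 log 2.
uniform = [0.25, 0.25, 0.25, 0.25]
print(value_at_optimal_d(uniform, uniform))  # ≈ -1.3863 (= -2 log 2)
```

Note how the sketch reflects both claims in the text: when $p_g$ places mass where $p_{data}$ is near zero, $D^*$ there is near zero, and the global minimum of the game is reached exactly when the generator reproduces the data distribution.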
