
evaluate D · esrgan (open, 5 comments)

xinntao commented on May 29, 2024
evaluate D


Comments (5)

xinntao commented on May 29, 2024

I think that D cannot make absolutely correct predictions, as D is also improving itself during training.
Typical losses of G and D can be found here: https://github.com/carpedm20/DCGAN-tensorflow (see the Training details section).

I usually print the D outputs to see whether D is trained OK. If the outputs change suddenly or drastically, D is usually behaving badly.
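
(A minimal sketch of this kind of monitoring in eager TensorFlow; netD, real_batch, and fake_batch are hypothetical stand-ins for the actual discriminator and data batches:)

import tensorflow as tf

def log_d_outputs(netD, real_batch, fake_batch, step):
    # Mean sigmoid(D(x)) over the batch: values pinned at ~1.0 on real and
    # ~0.0 on fake for many steps suggest D has overpowered G.
    d_real = tf.reduce_mean(tf.sigmoid(netD(real_batch)))
    d_fake = tf.reduce_mean(tf.sigmoid(netD(fake_batch)))
    print('step %d  D(real): %.4f  D(fake): %.4f' % (step, float(d_real), float(d_fake)))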


JimmyChame commented on May 29, 2024

Hi, I have another doubt about GANs. @xinntao
When I use a GAN loss and a perceptual loss to fine-tune my network, which is pre-trained with an MSE loss, I save the outputs of the generator during training. I then observe that the outputs become textured, then smooth, then textured again, and so on. Is this a normal phenomenon during training? Is the best result achieved at the end of training? Thank you!
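
(For reference, a minimal sketch of saving such intermediate generator outputs, assuming a hypothetical generator netG, a fixed validation input lr_batch, and outputs already scaled to [0, 1]:)

import tensorflow as tf

def save_snapshot(netG, lr_batch, step):
    # Write the first super-resolved image of the batch to disk so the
    # textured/smooth oscillation can be inspected over time.
    sr = netG(lr_batch)
    img = tf.cast(tf.clip_by_value(sr[0], 0.0, 1.0) * 255.0, tf.uint8)
    tf.io.write_file('snapshots/step_%06d.png' % step, tf.io.encode_png(img))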


xinntao commented on May 29, 2024

@JimmyChame
At the beginning of training, the outputs will fluctuate a lot. As training progresses, they become more stable.


JimmyChame commented on May 29, 2024

Thanks for your reply. I have another question, if you don't mind. @xinntao
I checked the losses you mentioned in DCGAN-tensorflow, but I have a small doubt about them.
In the ideal case, the discriminator cannot distinguish between real and generated images at the end of training. The outputs of the discriminator will then be around 0.5, which means both the real and fake logits are 0. So I calculated the losses myself as follows:

import tensorflow as tf

# Ideal case: D outputs 0.5 everywhere, i.e. the real and fake logits are both 0.
logits_shape = [16, 1]  # dummy shape standing in for D's per-image logits
real = tf.zeros(logits_shape)  # logits on real images
fake = tf.zeros(logits_shape)  # logits on generated images
real_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=real, labels=tf.ones_like(real)))
fake_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=fake, labels=tf.zeros_like(fake)))
d_loss = real_loss + fake_loss  # 2 * log(2) ~= 1.386
g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=fake, labels=tf.ones_like(fake)))  # log(2) ~= 0.693

Then I get d_loss and g_loss values of 1.3855876 and 0.6927938, respectively. There is a large deviation between the values plotted in DCGAN-tensorflow and the ones I calculated. Am I wrong, or is this just an ideal case?
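
(As a sanity check, these numbers can also be derived in closed form; a minimal sketch:)

import math

# sigmoid_cross_entropy_with_logits(logits=x, labels=z)
#   = max(x, 0) - x*z + log(1 + exp(-|x|))
# With x = 0 this reduces to log(2) for any label z.
per_term = math.log(2.0)  # ~= 0.6931
print('ideal g_loss:', per_term)      # one fake term
print('ideal d_loss:', 2 * per_term)  # real term + fake term ~= 1.3863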


xinntao commented on May 29, 2024

@JimmyChame It is just the ideal case.
