
training time (dagan) · CLOSED · 4 comments

antreasantoniou commented on July 29, 2024
training time


Comments (4)

HamzahNizami commented on July 29, 2024

Hi there, any tips on how to train on my own dataset? Thank you.

Thank you very much for sharing your code. I am learning your method, but can you tell me roughly how long it takes to train on the two datasets? I'm not sure whether there was an error during my run, but it has taken a long time.
I look forward to your reply.


GuChenghs commented on July 29, 2024

Thank you for your reply.
I ran the program on the Omniglot dataset twice with the same parameters, and these are the results. I'm sorry, I am a newcomer to machine learning; I hope you could explain the meaning of the parameters and results.
[attached screenshots: ex1, ex2]


AntreasAntoniou commented on July 29, 2024

@GuChenghs: The time required to train a DAGAN depends on the dataset you are training on and on the hardware you are using. The results you are observing show the generator and critic losses over iterations. It appears you have trained the model for 197 iterations, which is nowhere near enough to train a good DAGAN on Omniglot; you'll need at least 20 * 500 iterations before you begin to see good results.

As for interpreting the results: the most important metrics are total_d_train_loss and total_d_val_loss, as they correlate with sample quality. If you want to read further on what those losses indicate, you should at the very least read the Wasserstein GAN and Improved Wasserstein GAN papers to get an understanding of how the adversarial losses used in DAGAN work.
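
(Editor's note: the dagan repository itself is written in TensorFlow; purely as an illustration of what the critic and generator losses referenced above measure, here is a minimal WGAN-GP sketch in PyTorch. The function names, the `critic`/`fake` arguments, and `gp_weight=10.0` are assumptions for the sketch and do not correspond to anything in the repository.)

```python
import torch

def critic_loss(critic, real, fake, gp_weight=10.0):
    """Illustrative WGAN-GP critic loss (Gulrajani et al., 2017), not the dagan code."""
    fake = fake.detach()  # the critic step does not backprop into the generator
    # Wasserstein estimate: the critic should score real data higher than fake data.
    w_loss = critic(fake).mean() - critic(real).mean()

    # Gradient penalty on random interpolates between real and fake samples
    # (assumes 4-D image batches: N x C x H x W).
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    grads = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    gp = ((grads.flatten(1).norm(2, dim=1) - 1.0) ** 2).mean()

    return w_loss + gp_weight * gp

def generator_loss(critic, fake):
    # The generator tries to make the critic score its samples as highly as possible.
    return -critic(fake).mean()
```

A falling total_d_train_loss / total_d_val_loss under this kind of objective roughly tracks how well the critic can still separate real from generated samples, which is why those two quantities are the ones worth watching.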


GuChenghs commented on July 29, 2024

Thank you very much for your quick reply.
I trained the model for 500 epochs as you suggested and got some results. I now want to test how much the DAGAN improves a classifier, following the method described in the paper. Regarding the selection and training of the classifier, could you provide more information or code? I have looked at the generated data, but the details of the next training step are not clear to me: how should the data generated by the DAGAN be used to train the DenseNet classifier? I am trying my best to reproduce the experimental results of the paper. Also, is the training for the VGG-Face dataset 500 epochs as well?
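
(Editor's note: for context, the data-augmentation experiment in the paper trains the classifier on the real images plus DAGAN-generated samples of the same classes. The sketch below is only a rough PyTorch illustration of that idea, not the authors' pipeline; `augment_with_dagan`, the generator call signature, the latent size of 100, and `per_image=4` are all assumptions.)

```python
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

def augment_with_dagan(generator, images, labels, per_image=4):
    """Hypothetical helper: draw `per_image` DAGAN augmentations per real image.

    `generator` is assumed to map a real image plus a noise vector to an
    augmented image of the same class, as described in the DAGAN paper.
    """
    generator.eval()
    fakes, fake_labels = [], []
    with torch.no_grad():
        for _ in range(per_image):
            noise = torch.randn(images.size(0), 100, device=images.device)  # latent size is an assumption
            fakes.append(generator(images, noise))
            fake_labels.append(labels)
    return TensorDataset(torch.cat(fakes), torch.cat(fake_labels))

def build_augmented_loader(real_images, real_labels, generator, batch_size=64):
    # The classifier (e.g. a DenseNet) is then trained on real + generated data
    # exactly as it would be on real data alone; only the dataset changes.
    real_ds = TensorDataset(real_images, real_labels)
    fake_ds = augment_with_dagan(generator, real_images, real_labels)
    return DataLoader(ConcatDataset([real_ds, fake_ds]), batch_size=batch_size, shuffle=True)
```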
