
Comments (4)

huanghoujing commented on July 20, 2024

GL-NF only uses Triplet Loss with Global distance. The differences from the open-reid triplet loss example include:

  • whether the standard Market1501 train/test split is used: no in open-reid, yes here
  • the type of data augmentation: mirroring + cropping in open-reid, only mirroring here
  • the batch size may differ; here it is 128 images
  • whether an extra embedding layer (128-dim) is added after average pooling (2048-dim): used in open-reid, not used here
  • whether the feature is normalized before computing the triplet loss and at test time: done in GL-NF. This has little influence when the triplet loss is trained with global distance only.
  • the epoch at which the learning rate starts to decay: epoch 100 in open-reid, epoch 75 here
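For reference, the global-distance triplet loss with the optional feature normalization mentioned in the fifth item can be sketched like this (a minimal NumPy sketch; the function names and the 0.3 margin are illustrative, not the repo's actual code):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.3, normalize=False):
    """Hinge-form triplet loss on one triplet using global (Euclidean) distance.

    `normalize` mirrors the GL-NF choice of L2-normalizing features before
    computing distances; with global distance only, it changes little.
    """
    feats = [np.asarray(f, dtype=np.float64) for f in (anchor, positive, negative)]
    if normalize:
        feats = [f / (np.linalg.norm(f) + 1e-12) for f in feats]
    a, p, n = feats
    d_ap = np.linalg.norm(a - p)  # global distance to the positive
    d_an = np.linalg.norm(a - n)  # global distance to the negative
    return max(0.0, d_ap + margin - d_an)  # standard hinge form
```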

Thanks :)

from alignedreid-re-production-pytorch.

marvis commented on July 20, 2024

Thanks for sharing your training details!

However, my experimental results differ on your fifth item. After normalizing the feature before computing the triplet loss, the CMC score decreases by ~8%. And I find that the extra embedding layer makes no difference to accuracy either way. So I suspect normalization is what caused the different baseline results?
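For anyone unfamiliar with the CMC score being compared here, Rank-1 is the fraction of queries whose nearest gallery feature shares the query identity. A simplified sketch (assuming Euclidean distance and ignoring the camera-ID filtering that the full Market1501 protocol applies; names are illustrative):

```python
import numpy as np

def cmc_rank1(query_feats, query_ids, gallery_feats, gallery_ids):
    """Simplified CMC Rank-1: fraction of queries whose nearest gallery
    feature (by Euclidean distance) has the same identity label."""
    q = np.asarray(query_feats, dtype=np.float64)
    g = np.asarray(gallery_feats, dtype=np.float64)
    # pairwise squared Euclidean distances, shape (num_query, num_gallery)
    dist = ((q[:, None, :] - g[None, :, :]) ** 2).sum(-1)
    nearest = dist.argmin(axis=1)
    hits = [query_ids[i] == gallery_ids[j] for i, j in enumerate(nearest)]
    return sum(hits) / len(hits)
```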

Best,


huanghoujing commented on July 20, 2024
  1. I found that for Global Distance + Triplet Loss, i.e. the commonly used triplet-loss paradigm, normalizing the feature or not has little influence.
  2. Without normalization, I found it possible to train Global Distance + Local Distance + Triplet Loss, so later experiments only consider unnormalized features.
  3. By "decreases ~8%", do you mean GL-LL-NF-LHSFLD-TWLD is worse than GL-LL-NNF-LHSFLD-TWLD by ~8 CMC Rank-1 points?
  4. BTW, the percentage of triplets satisfying the margin, i.e. distance(anchor, positive) + margin < distance(anchor, negative), is a good signal for checking convergence. It should be at least over 90% for both global and local distance. Do you see anything abnormal in this signal at the end of training?
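The convergence signal in item 4 can be computed along these lines (a hypothetical helper over per-triplet distances; the name and the 0.3 margin are illustrative):

```python
import numpy as np

def satisfying_margin_pct(d_ap, d_an, margin=0.3):
    """Percentage of triplets with d(anchor, positive) + margin < d(anchor, negative).

    Per the comment above, this should be at least over 90% for both the
    global and the local distance once training has converged.
    """
    d_ap = np.asarray(d_ap, dtype=np.float64)  # anchor-positive distances
    d_an = np.asarray(d_an, dtype=np.float64)  # anchor-negative distances
    return 100.0 * np.mean(d_ap + margin < d_an)
```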


huanghoujing commented on July 20, 2024

With more experiments, I find that for the vanilla triplet loss, normalizing the feature or not indeed makes a difference, the latter (no normalization) being better. Some results are in the provided Excel file.

Thank you for the discussion.

