
flhonker / awesome-knowledge-distillation

2.4K 2.4K 335.0 468 KB

Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.

deep-learning distillation kd knowldge-distillation model-compression transfer-learning
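For readers new to the topic: the classic recipe behind most papers in this list (Hinton et al., 2015) trains a small student network to match a large teacher's temperature-softened output distribution, mixed with the usual hard-label loss. A minimal NumPy sketch (function names and the `T`/`alpha` defaults here are illustrative, not taken from any specific paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft term: cross-entropy against the teacher's softened distribution,
    # scaled by T^2 so gradient magnitudes stay comparable as T varies.
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T * T
    # Hard term: standard cross-entropy with the ground-truth labels.
    probs = softmax(student_logits)
    hard = -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard
```

Most "knowledge from logits" papers in the list are variations on this loss; the feature-based and relational methods instead match intermediate activations or pairwise structure.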

awesome-knowledge-distillation's People

Contributors

akashgupta-vimaanrobotics, cardwing, fawazsammani, flhonker, fmthoker, forjiuzhou, jaywonchung, lioutasb, pyjulie, roymiles, shivmgg, sinp17, ta012, winycg, zainzhao


awesome-knowledge-distillation's Issues

About KD papers in recommender systems

Thank you for your contribution.
Could I create a new branch that adds a category "KD in rec"?
Thank you again!

Correction for "Awesome-Knowledge-Distillation"

Hi, thank you for listing our paper "Knowledge Distillation Beyond Model Compression"; we really appreciate your effort.

I would like to request a correction to the paper's authorship: it should be Sarfraz et al.

Also, our work is not related to pruning or quantization; I believe a more relevant place for it would be the "beyond" section.

Please add a relevant paper from CVPR 2024

Hi @FLHonker
I noticed that no papers from 2024 are listed yet. Could you please add a relevant CVPR 2024 Highlight paper? The information is given below:

Title: Logit Standardization in Knowledge Distillation
Paper: https://arxiv.org/abs/2403.01427
Github: https://github.com/sunshangquan/logit-standardization-KD
Supplements: https://sunsean21.github.io/resources/cvpr2024_supp.pdf

The paper discusses assigning temperatures distinctly between the teacher and student, and dynamically across samples. It then proposes a weighted z-score logit standardization as a plug-and-play preprocessing step that can boost existing logit-based KD methods.
Thank you for your attention.
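The core idea described in this issue can be sketched in a few lines of NumPy. Note this is a simplified, unweighted version of what the issue describes (the paper's actual method uses a weighted z-score with a base temperature; function names here are illustrative):

```python
import numpy as np

def zscore_standardize(logits, eps=1e-7):
    """Shift and scale each sample's logits to zero mean and unit variance."""
    mean = logits.mean(axis=-1, keepdims=True)
    std = logits.std(axis=-1, keepdims=True)
    return (logits - mean) / (std + eps)

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_kl_loss(teacher_logits, student_logits, T=2.0):
    """KL divergence between softened, standardized teacher/student outputs."""
    p = softmax(zscore_standardize(teacher_logits), T)
    q = softmax(zscore_standardize(student_logits), T)
    return (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1).mean()
```

Because both sets of logits are standardized before the softmax, the student is pushed to match the teacher's logit *ranking and relative gaps* rather than its absolute scale, which is what makes the step compatible with existing logit-based KD losses.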

One more paper

Would you mind adding this paper to the "Applications of KD" section?

How many Observations are Enough? Knowledge Distillation for Trajectory Forecasting. Monti, Alessio et al. CVPR 2022

Project links

Thank you for sharing.
Just a suggestion: it would be even better if project (code) links were included.

Could you please include our paper in the "Knowledge from logits" section?

Hi,

My name is Marco Mistretta, and I am a researcher at MICC (Florence, Italy).
Thank you for creating this repository! It has been an invaluable resource for me.

Could you please add our ECCV24 paper, Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation?
It's the first time that unsupervised knowledge distillation has been applied to prompt learning in the few-shot setting!

Thank you for your time and consideration!

TITLE: Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation
CONFERENCE: ECCV24
PAPER: https://arxiv.org/abs/2407.03056
CODE: https://github.com/miccunifi/KDPL

One more paper

Could you help add this paper? I think it belongs under structured distillation:

Yu, Lu, Vacit Oguz Yazici, Xialei Liu, Joost van de Weijer, Yongmei Cheng, and Arnau Ramisa. "Learning Metrics from Teachers: Compact Networks for Image Embedding." In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2907-2916. 2019.

Papers in 2022?

I find that papers from 2022 are lacking.
Will you continue to collect related research published in 2022?

About recommendation

Hello, thank you for sharing. There currently seem to be relatively few papers related to recommendation, so I would like to ask: is this because the area is fairly new, or are there serious problems that make research difficult?
