This is the official release of "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization" (ECCV 2022).

Our method searches for an efficient knowledge distillation process via meta optimization and achieves faster distillation training on ImageNet-1K.
Install the dependencies:

```bash
pip install -r requirements.txt
```
To train with distillation for 40 epochs, choose a paper setting and run:

```bash
python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 40
```

To train for 240 epochs with a specified alpha normalization style:

```bash
python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 240 \
    --alpha_normalization_style 333
```
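The `--alpha_normalization_style` flag presumably controls how the learned per-pathway weights (the alphas searched by DistPro) are normalized before they combine the individual distillation losses. Below is a minimal, illustrative sketch of such a weighted feature-distillation loss; the function name, the softmax normalization, and the L2 feature matching are assumptions made for illustration, not the exact implementation in this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_feats, teacher_feats, alphas):
    """Combine per-pathway feature losses using normalized alpha weights (illustrative sketch)."""
    # Assumed normalization; the actual scheme is selected by --alpha_normalization_style.
    weights = F.softmax(alphas, dim=0)
    losses = torch.stack([
        F.mse_loss(s, t.detach())  # simple L2 feature matching per teacher/student pathway
        for s, t in zip(student_feats, teacher_feats)
    ])
    return (weights * losses).sum()

# Example with three hypothetical teacher/student feature pathways
alphas = torch.zeros(3, requires_grad=True)  # learnable pathway weights (searched, not hand-tuned)
s_feats = [torch.randn(8, 64, 32, 32) for _ in range(3)]
t_feats = [torch.randn(8, 64, 32, 32) for _ in range(3)]
loss = distillation_loss(s_feats, t_feats, alphas)
```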
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
If you find this work useful, please cite:

```bibtex
@inproceedings{deng2022distpro,
  title={DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization},
  author={Deng, Xueqing and Sun, Dawei and Newsam, Shawn and Wang, Peng},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2022}
}
```