
License: Apache License 2.0


iDAT: inverse Distillation Adapter-Tuning

This is the official repository of iDAT: inverse Distillation Adapter-Tuning, accepted by ICME 2024. {Arxiv Paper}

Abstract

The Adapter-Tuning (AT) method freezes a pre-trained model and introduces trainable adapter modules to acquire downstream knowledge, calibrating the model for better adaptation to downstream tasks. Instead of crafting yet another carefully designed adapter module, this paper proposes a distillation framework for the AT method that aims to improve fine-tuning performance. For the first time, we explore the possibility of combining the AT method with knowledge distillation. Through statistical analysis, we observe significant differences in knowledge acquisition between the adapter modules of different models. Leveraging these differences, we propose a simple yet effective framework called inverse Distillation Adapter-Tuning (iDAT). Specifically, we designate the smaller model as the teacher and the larger model as the student. The two are trained jointly, and online knowledge distillation injects knowledge from a different perspective into the student model, significantly enhancing its fine-tuning performance on downstream tasks. Extensive experiments on the VTAB-1K benchmark, covering 19 image classification tasks, demonstrate the effectiveness of iDAT. The results show that using an existing AT method within our iDAT framework yields a further 2.66% performance gain with only 0.07M additional trainable parameters. Our approach compares favorably with the state of the art without bells and whistles.
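The online-distillation objective described above can be sketched as follows. This is a minimal NumPy illustration of the soft-label distillation term that a small teacher contributes to the large student's loss; the temperature `T` and the KL form are standard distillation ingredients and illustrative assumptions here, not the paper's exact loss.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Soft-label KL divergence KL(teacher || student), averaged over the batch.

    In iDAT the teacher is the SMALLER model; its softened predictions are
    distilled into the larger student during joint (online) training.
    """
    p_t = softmax(teacher_logits / T)
    p_s = softmax(student_logits / T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # T^2 rescales gradients back to the scale of the hard-label loss
    return float((T * T) * kl.mean())
```

In a full training loop this term would be added to the student's ordinary task loss, e.g. `loss = cross_entropy + lam * kd_loss(...)`, with `lam` a hypothetical weighting hyperparameter.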

0. Main Environments

  • Create a conda virtual environment and activate it:

conda create -n iDAT python=3.8 -y
conda activate iDAT

  • Install requirements:

pip install -r requirements.txt

1. Prepare the dataset

VTAB-1K: You can follow SSF to download them, or download directly through this link (Baidu Netdisk).

2. Prepare the pre-trained weights

For pre-trained ViT models on ImageNet-21K, the weights will be automatically downloaded. You can also manually download them from ViT.

3. Fine-tuning within our iDAT framework

To fine-tune a pre-trained ViT model via Adapter within our iDAT framework on VTAB-1K, run:

bash train_scripts/vit/train_vtab_onlinekd.sh
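The script above fine-tunes via an adapter inside the frozen ViT. For orientation, here is a minimal NumPy sketch of the common bottleneck adapter design (down-project, nonlinearity, up-project, residual); the hidden width, activation, and near-zero initialization are illustrative assumptions, not the repository's exact module.

```python
import numpy as np

rng = np.random.default_rng(0)

class Adapter:
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add.

    Only these few weights are trained; the pre-trained backbone stays frozen.
    """
    def __init__(self, dim=768, bottleneck=8):
        # small-scale down-projection, zero-initialized up-projection:
        # at the start of training the adapter is an identity mapping,
        # so the adapted model coincides with the pre-trained one
        self.W_down = rng.normal(0.0, 0.01, (dim, bottleneck))
        self.W_up = np.zeros((bottleneck, dim))

    def __call__(self, x):
        h = np.maximum(x @ self.W_down, 0.0)  # ReLU bottleneck
        return x + h @ self.W_up              # residual connection
```

Inserted after (or parallel to) each transformer block, a module like this adds on the order of `2 * dim * bottleneck` trainable parameters per block, which is why AT methods stay so parameter-efficient.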

4. Acknowledgement

Thanks to SSF for their open-source code.

