
[NeurIPS 2023 Spotlight] This project is the official implementation of our accepted NeurIPS 2023 (spotlight) paper QuantSR: Accurate Low-bit Quantization for Efficient Image Super-Resolution.

License: Apache License 2.0



QuantSR: Accurate Low-bit Quantization for Efficient Image Super-Resolution

This project is the official implementation of our accepted NeurIPS 2023 (spotlight) paper QuantSR: Accurate Low-bit Quantization for Efficient Image Super-Resolution [PDF]. Created by researchers from Beihang University and ETH Zürich.


Introduction

Low-bit quantization for image super-resolution (SR) has attracted considerable attention in recent research because it significantly reduces parameters and operations. However, many quantized SR models suffer accuracy degradation relative to their full-precision counterparts, especially at ultra-low bit widths (2-4 bits), which limits their practical application. To address this issue, we propose QuantSR, a novel quantized image SR network that achieves accurate and efficient SR under low-bit quantization.

To overcome the representation homogeneity caused by quantization, we introduce the Redistribution-driven Learnable Quantizer (RLQ). Its inference-agnostic, efficient redistribution design adds information in both the forward and backward passes to improve the representation ability of the quantized network. Furthermore, to enable flexible inference and break through the accuracy ceiling, we propose the Depth-dynamic Quantized Architecture (DQA), which trades off efficiency against accuracy at inference time through weight sharing.

Comprehensive experiments show that QuantSR outperforms existing state-of-the-art quantized SR networks in accuracy while offering competitive computational efficiency. We also demonstrate the scheme's architecture generality with QuantSR-C (convolutional) and QuantSR-T (Transformer) variants.
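To make the two ingredients concrete, here is a minimal NumPy sketch of (a) a symmetric low-bit uniform quantizer with a learnable scale, and (b) a pre-quantization redistribution step. Note that the function names and the tanh-based redistribution form below are illustrative assumptions for exposition only; the actual RLQ design and its forward/backward formulation are given in the paper, not here.

```python
import numpy as np

def quantize(x, scale, bits=4):
    """Symmetric uniform quantization to `bits` bits.

    `scale` plays the role of a learnable step size; the returned
    value is the dequantized tensor used by the next layer.
    """
    qmax = 2 ** (bits - 1) - 1
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q * scale

def redistribute(x, alpha=0.1):
    """Hypothetical redistribution before quantization.

    A smooth reshaping that perturbs values away from quantizer
    decision boundaries, adding information to the forward pass.
    This specific tanh form is NOT the paper's RLQ; it only
    illustrates the idea of an inexpensive redistribution step.
    """
    return x + alpha * np.tanh(x)

# Example: 4-bit quantization clips to [-8, 7] steps of `scale`.
x = np.array([0.3, 1.2, -5.0])
print(quantize(redistribute(x, alpha=0.0), scale=0.5, bits=4))
```

With `alpha=0.0` the redistribution is the identity, so the example reduces to plain 4-bit quantization: 0.3 snaps to 0.5, 1.2 to 1.0, and -5.0 saturates at -8 x 0.5 = -4.0. In training, the scale (and any redistribution parameters) would be learned end-to-end, typically with a straight-through estimator for the rounding gradient.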

Dependencies

# Run from the repository root
pip install -r requirements.txt
python setup.py develop

Execution

# We provide a script to test our 4-bit QuantSR-C
sh test.sh

Citation

If you find our work useful in your research, please consider citing:

@inproceedings{qin2023quantsr,
  author    = {Haotong Qin and Yulun Zhang and Yifu Ding and Yifan Liu and Xianglong Liu and Martin Danelljan and Fisher Yu},
  title     = {QuantSR: Accurate Low-bit Quantization for Efficient Image Super-Resolution},
  booktitle = {Conference on Neural Information Processing Systems (NeurIPS)},
  year      = {2023}
}


quantsr's Issues

Some problems in QuantSR

Congratulations on the acceptance of your paper! I've read the initial draft of your paper on OpenReview, and I have three questions to ask you:

  1. Have you applied other SR quantization methods, such as PAMS, to Transformer and compared the results with your QuantSR-T? It seems that the results were not compared in your paper.

  2. In Table 2, the results of the DoReFa 2-bit quantization on the x4 SR network show higher accuracy than the results with a 4-bit quantization. Is there an issue with this?

  3. The introduction of DQA in the paper is not detailed enough, and it is unclear how it skips certain layers. The corresponding code for this part has also not been released.
