
Comments (6)

poonehmousavi commented on June 12, 2024

I totally agree. What we have done for Llama 2 is QLoRA (quantization + LoRA), and it is only applicable to Llama 2. With new public LLMs being released so often, and all of them needing LoRA for fine-tuning, I think we need a unified class that can handle LoRA (and QLoRA) for different models. In HF, sft_trainer.py handles this kind of efficient fine-tuning. We also need an easy interface in SB for efficient fine-tuning, and it should work well with DDP.
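For reference, the HF-side workflow being described looks roughly like this (a sketch assuming the `trl` and `peft` packages; the model id, dataset, and LoRA hyperparameters are placeholders, and exact arguments vary across `trl` versions):

```python
# Sketch of the HF trl/peft "unified" fine-tuning interface mentioned above.
# Model id, dataset, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTTrainer

train_set = load_dataset("imdb", split="train")  # placeholder dataset

peft_config = LoraConfig(
    r=8,              # rank of the low-rank update
    lora_alpha=16,    # scaling applied to the update
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# SFTTrainer injects the LoRA modules into the base model automatically.
trainer = SFTTrainer(
    model="meta-llama/Llama-2-7b-hf",  # any HF causal LM id
    train_dataset=train_set,
    peft_config=peft_config,
)
trainer.train()
```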


mravanelli commented on June 12, 2024

I think it is very important too. We should support at least LoRA and adapters. I'm not sure what the best way is to support this elegantly. One idea could be to implement a wrapper (e.g., Adapter) that is applied to our models and plugs in the necessary new modules. However, it seems quite hard to create something that is easy to use and flexible at the same time. Any ideas from the community?


pplantinga commented on June 12, 2024

I have used PEFT to apply LoRA to an existing model, and it was pretty straightforward to use. You just pass the model and it automatically replaces all the relevant layers with LoRA layers. We could do something similar.
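Roughly like this (a minimal sketch of that PEFT workflow; the base model and `target_modules` names are illustrative and depend on the architecture):

```python
# Minimal PEFT/LoRA sketch: the listed modules are swapped for
# LoRA-augmented versions, and only the LoRA weights stay trainable.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["c_attn"],  # which layers to adapt (model-specific)
    lora_dropout=0.05,
)

model = get_peft_model(model, config)
model.print_trainable_parameters()  # only the LoRA weights remain trainable
```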


pplantinga commented on June 12, 2024

We could even import and use PEFT directly if its dependencies are light.


Adel-Moumen commented on June 12, 2024

Hi @TParcollet, I agree with your strategy.

> I have used PEFT to apply LoRA to an existing model, and it was pretty straightforward to use. You just pass the model and it automatically replaces all the relevant layers with LoRA layers. We could do something similar.

I don't think the PEFT library is necessary, since the theory/implementation is quite "easy" to reproduce. For instance, ESPnet has its own implementation of LoRA (https://github.com/espnet/espnet/blob/47de29c21f5a7db22089717e92add5e9604fcd48/espnet2/layers/create_adapter_fn.py#L224). We should follow the same strategy and provide our own adapters, because many researchers will want to develop their own designs or modify the code, which may be harder if we couple SpeechBrain and PEFT.

Note: we could have our own implementation and also provide a PEFT-compatible wrapper, but I don't know if this makes sense or is necessary.
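For illustration, a hand-rolled LoRA linear in plain PyTorch (a sketch in the spirit of the ESPnet code linked above, not an actual SpeechBrain design) can stay very small:

```python
# Self-contained LoRA linear: y = base(x) + (alpha / r) * x A^T B^T,
# where A (r x in) and B (out x r) are the only trainable parameters.
import math
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the pretrained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.empty(r, base.in_features))
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        # B starts at zero, so the wrapped layer is initially identical
        # to the frozen base layer.
        nn.init.kaiming_uniform_(self.lora_A, a=math.sqrt(5))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the scaled low-rank update.
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)
```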


TParcollet commented on June 12, 2024

Alright, I think that there are two problems here.

  1. Building a wrapper that naively replaces Linear layers with LoRA / Houlsby adapters and the like is simple; I did it, and it's not that hard (sketched below). However, this is only valid for basic research purposes, and making it slightly more evolved, e.g. only adapting some specified linear layers, may be a bit harder.
  2. Building something that is usable in practice, with quantisation, fast implementations, etc., may be MUCH more complicated. The literature on adapters for LLMs is absurdly massive, and supporting all of it is impossible. Here, I'd say that supporting an extra lib could make sense, but it would need to be very stable and reliable AND cover more than what we can do in 1. Otherwise, there is no point in losing control over the code.

I am happy to do a PR for 1., but it will remain shallow IMHO.
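To make 1. concrete, a naive replacement wrapper might look like this (a hypothetical helper, not SpeechBrain's actual API; `LoRALinear` is the hand-rolled class sketched earlier in the thread, and the name patterns are illustrative):

```python
# Naive wrapper for problem 1: walk the model and swap matching nn.Linear
# modules for LoRA-augmented ones. Patterns allow adapting only some layers,
# e.g. apply_lora(model, patterns=["*attn*"]).
import fnmatch
import torch.nn as nn

def apply_lora(model: nn.Module, patterns=("*",), r: int = 8, alpha: int = 16):
    # Snapshot the module list first, since we mutate the tree as we go.
    for name, module in list(model.named_modules()):
        for child_name, child in list(module.named_children()):
            full_name = f"{name}.{child_name}" if name else child_name
            if isinstance(child, nn.Linear) and any(
                fnmatch.fnmatch(full_name, p) for p in patterns
            ):
                setattr(module, child_name, LoRALinear(child, r=r, alpha=alpha))
    return model
```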

