binarycrayon / lit-llama
Forked from lightning-ai/lit-llama.
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training.
License: Apache License 2.0
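The description mentions LoRA fine-tuning. As a minimal NumPy sketch of the LoRA idea (a frozen pretrained weight plus a trainable low-rank correction), not lit-llama's actual API, the dimensions, rank, and names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2                    # layer dims and LoRA rank (illustrative)
W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized
alpha = 4.0
scaling = alpha / r                         # standard LoRA scaling factor

def lora_forward(x):
    # y = W x + (alpha / r) * B A x : base output plus low-rank correction
    return W @ x + scaling * (B @ (A @ x))

x = rng.standard_normal(d_in)
# Because B starts at zero, the LoRA layer initially matches the base layer;
# training then adjusts only A and B, leaving W untouched.
assert np.allclose(lora_forward(x), W @ x)
```

Fine-tuning only `A` and `B` (2 * r * d parameters per layer instead of d * d) is what makes LoRA cheap enough to adapt large models on a single GPU.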