LoRA
2 mentions across 1 person
“[1]: [LoRA](https://arxiv.org/abs/2106.09685): Low-Rank Adaptation of Large Language Models, Hu et al. 2021”
Mistral Inference: Open-weight Model Deployment and Usage ↗“It is based on [LoRA](https://arxiv.org/abs/2106.09685), a training paradigm where most weights are frozen and only 1-2% of additional weights in the form of low-rank matrix perturbations are trained.”
Mistral-finetune: A LoRA-based Solution for Memory-Efficient Fine-tuning of Mist ↗
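The "frozen weights plus low-rank perturbation" idea described in the quote above can be sketched in plain NumPy. This is a minimal illustration, not Mistral's actual implementation: the layer size `d` and rank `r` below are hypothetical, and a real setup would use a deep-learning framework with autograd.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 1024, 8  # hypothetical layer width and LoRA rank

# Frozen pretrained weight: never updated during fine-tuning.
W = rng.standard_normal((d, d)) * 0.02

# Trainable low-rank factors. B starts at zero, so the perturbation
# B @ A is initially zero and the adapted layer matches the original.
A = rng.standard_normal((r, d)) * 0.02
B = np.zeros((d, r))

def lora_forward(x, scale=1.0):
    # Effective weight is W + scale * (B @ A); only A and B would
    # receive gradient updates during fine-tuning.
    return x @ W + scale * (x @ B) @ A

x = rng.standard_normal((2, d))
y = lora_forward(x)

# Trainable fraction: 2*d*r low-rank parameters vs d*d frozen ones.
trainable = A.size + B.size
frozen = W.size
print(f"trainable share: {trainable / (trainable + frozen):.2%}")
```

With these illustrative sizes the trainable share comes out to roughly 1.5%, consistent with the "1-2% of additional weights" figure quoted above; the share scales as `2r/d` for a square layer, so the rank `r` directly controls the memory savings.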