Mistral Inference
“This repository contains minimal code to run Mistral models.”
Mistral Inference: Open-weight Model Deployment and Usage
“Once your model is trained, you should try it out in inference. We recommend using [mistral-inference](https://github.com/mistralai/mistral-inference).”
Mistral-finetune: A LoRA-based Solution for Memory-Efficient Fine-tuning of Mist…