QuantFactory/Ministral-3b-instruct-GGUF
This is a quantized version of ministral/Ministral-3b-instruct, created using llama.cpp.
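Since this repository ships GGUF files produced with llama.cpp, one way to try the model locally is through the llama-cpp-python bindings. The sketch below is an illustration, not an official usage guide: it assumes the llama-cpp-python and huggingface_hub packages are installed, and the GGUF filename is a hypothetical placeholder; check the repository's file listing for the actual quantization variants.

```python
# Minimal sketch: download a GGUF file from this repo and run it locally
# with llama-cpp-python (Python bindings for llama.cpp).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="QuantFactory/Ministral-3b-instruct-GGUF",
    filename="Ministral-3b-instruct.Q4_K_M.gguf",  # hypothetical filename -- check the repo files
)

# Load the quantized model; n_ctx sets the context window size.
llm = Llama(model_path=model_path, n_ctx=2048)

# Run a simple completion.
output = llm(
    "Explain what quantization does to a language model.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```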
Original Model Card
Model Description
Ministral is a series of language models built with the same architecture as the well-known Mistral model, but at a smaller size.
- Model type: A 3B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- Language(s) (NLP): Primarily English
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-v0.1