llava-v1.6-mistral-7b-hf-nf4 is a bitsandbytes (bnb) NF4 quantization of llava-hf/llava-v1.6-mistral-7b-hf.
For batch processing you can use ide-cap-chan.
All other features are inherited from the parent model.
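A minimal loading and inference sketch with transformers, assuming the checkpoint embeds its bitsandbytes NF4 quantization config (so `from_pretrained` picks it up automatically); the sample image URL and prompt are only illustrative:

```python
import torch
import requests
from PIL import Image
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration

model_id = "2dameneko/llava-v1.6-mistral-7b-hf-nf4"

# Load processor and the pre-quantized NF4 weights; device_map places layers on GPU.
processor = LlavaNextProcessor.from_pretrained(model_id)
model = LlavaNextForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Single image + prompt in the llava-v1.6-mistral chat format.
url = "https://llava-vl.github.io/static/images/view.jpg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "[INST] <image>\nDescribe this picture. [/INST]"

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```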
Base model: llava-hf/llava-v1.6-mistral-7b-hf