---
library_name: transformers
tags: []
---
# Mistral-Small-Instruct-2409 - EXL2 6.8 rpcal_mk2
This is a 6.8bpw EXL2 quant of [mistralai/Mistral-Small-Instruct-2409](https://huggingface.co/mistralai/Mistral-Small-Instruct-2409).
This quant was made using exllamav2 0.2.2 with the [Fullmoon-Light dataset](https://huggingface.co/datasets/ParasiticRogue/Fullmoon-Light) as RP-oriented calibration data.
I tested this quant briefly in a few random RPs (including some with over 8k and 16k of context) and it appears to work fine.
## Prompt Templates
Uses the Mistral instruct format.
For more details, see the original model card.
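As a rough illustration, the Mistral instruct format wraps each user turn in `[INST] ... [/INST]` markers. The sketch below builds such a prompt string by hand; exact whitespace and BOS/EOS handling varies between tokenizer versions, so in practice prefer the tokenizer's `apply_chat_template`, which encodes the canonical format for this model.

```python
# Hedged sketch of the Mistral instruct prompt format.
# Spacing around [INST]/[/INST] and </s> placement can differ by
# tokenizer version; this is illustrative, not authoritative.

def build_mistral_prompt(turns):
    """turns: list of (user, assistant) pairs; the final assistant
    reply may be None when prompting for a new completion."""
    prompt = "<s>"
    for user, assistant in turns:
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            prompt += f" {assistant}</s>"
    return prompt

print(build_mistral_prompt([("Hello!", "Hi there."), ("How are you?", None)]))
```

With a `transformers` tokenizer loaded, the equivalent and safer call is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`.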