ExLlamaV2 quant of Mistral-11B-CC-Air (8.0bpw, h6)
---
license: apache-2.0
tags:
- mistral
- pretrained
---
CollectiveCognition-v1.1-Mistral-7B and airoboros-mistral2.2-7b glued together.
<!-- description start -->
## Description
This repo contains ExLlamaV2-quantized (8.0bpw, h6) files of Mistral-11B-CC-Air.
<!-- description end -->
<!-- models-used start -->
## Models used
- [CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B)
- [airoboros-mistral2.2-7b](https://huggingface.co/teknium/airoboros-mistral2.2-7b/)
<!-- models-used end -->
<!-- prompt-template start -->
## Prompt template: Alpaca or default
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
```
USER: <prompt>
ASSISTANT:
```
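As a convenience, the Alpaca template above can be wrapped in a small helper. This is a minimal sketch (the function name and example instruction are illustrative, not part of this repo):

```python
def alpaca_prompt(instruction: str) -> str:
    """Format an instruction with the Alpaca template used by this model."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

# Example: the formatted string is what you feed to the model as the prompt.
prompt = alpaca_prompt("Summarize the following paragraph.")
print(prompt)
```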
## The secret sauce
```yaml
slices:
  - sources:
      - model: teknium/CollectiveCognition-v1.1-Mistral-7B
        layer_range: [0, 24]
  - sources:
      - model: teknium/airoboros-mistral2.2-7b
        layer_range: [8, 32]
merge_method: passthrough
dtype: float16
```
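The passthrough merge simply stacks the two layer slices back to back, which is where the extra parameters come from. A rough sketch in plain Python (not mergekit itself) of the layer arithmetic:

```python
# Layer slices from the config above: (model, [start, end) layer range).
slices = [
    ("teknium/CollectiveCognition-v1.1-Mistral-7B", (0, 24)),
    ("teknium/airoboros-mistral2.2-7b", (8, 32)),
]

# Passthrough concatenates the slices, so the merged depth is the sum of
# the slice lengths: 24 + 24 = 48 layers, versus 32 in a single Mistral-7B.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # → 48
```

The overlap (layers 8-23 appear in both slices, once from each parent) is what makes this a frankenmerge rather than a plain concatenation of two disjoint halves.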
Special thanks to Sushi.
If you want to support me, you can [here](https://ko-fi.com/undiai).