---
license: apache-2.0
tags:
- merge
- mergekit
- Trelis/Llama-2-7b-chat-hf-function-calling-v3
- xalss/Qwen2-7B-Instruct-glaive-function-calling
- InterSync/Mistral-7B-Instruct-v0.2-Function-Calling
- NousResearch/Hermes-2-Pro-Mistral-7B
---
# Uma-4x7B-MoE-Function-calling-v0.2
Hey there! 👋 Welcome to Uma-4x7B-MoE-Function-calling-v0.2! This model is a merge of four function-calling models, brought together using the awesome [VortexMerge kit](https://colab.research.google.com/drive/1YjcvCLuNG1PK7Le6_4xhVU5VpzTwvGhk#scrollTo=UG5H2TK4gVyl).
Let's see what we've got in this merge:
* [Trelis/Llama-2-7b-chat-hf-function-calling-v3](https://huggingface.co/Trelis/Llama-2-7b-chat-hf-function-calling-v3)
* [xalss/Qwen2-7B-Instruct-glaive-function-calling](https://huggingface.co/xalss/Qwen2-7B-Instruct-glaive-function-calling)
* [InterSync/Mistral-7B-Instruct-v0.2-Function-Calling](https://huggingface.co/InterSync/Mistral-7B-Instruct-v0.2-Function-Calling)
* [NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B)
## 🧩 Configuration
```yaml
models:
  - model: Trelis/Llama-2-7b-chat-hf-function-calling-v3
    parameters:
      weight: 0.30
  - model: xalss/Qwen2-7B-Instruct-glaive-function-calling
    parameters:
      weight: 0.60
  - model: InterSync/Mistral-7B-Instruct-v0.2-Function-Calling
    parameters:
      weight: 0.90
  - model: NousResearch/Hermes-2-Pro-Mistral-7B
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```
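
## 💻 Usage
The merge itself can be reproduced by saving the config above as `config.yml` and running `mergekit-yaml config.yml ./merged-model`. Below is a minimal sketch of how the merged checkpoint might be loaded for inference with 🤗 Transformers. The repository id is a placeholder assumption (point it at wherever this merge is published), and the chat-template call assumes the merged tokenizer ships one.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hugging Face path of this merge.
model_id = "Uma-4x7B-MoE-Function-calling-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # matches the float16 dtype used in the merge config
    device_map="auto",    # requires `accelerate`
)

# Simple chat-style prompt; assumes the merged tokenizer provides a chat template.
messages = [{"role": "user", "content": "Book a table for two at 7pm and confirm by email."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Strip the prompt tokens and print only the generated completion.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```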