---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- code
- QA
- reasoning
---
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->
A powerful 4x7B Mixture-of-Experts (Mixtral-style) model built from four Mistral-based models for greater accuracy and precision in general reasoning, QA, and code:

- HuggingFaceH4/zephyr-7b-beta
- mistralai/Mistral-7B-Instruct-v0.2
- teknium/OpenHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-3
- **Developed by:** NEXT AI
- **Funded by:** Zpay Labs Pvt Ltd.
- **Model type:** Mixtral of Mistral 4x7B
- **Language(s) (NLP):** English (code, reasoning, and QA tasks)
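Since the card lists `transformers` as the library, a minimal loading sketch may help. The repository ID below is a placeholder (the actual Hub ID is not given in this card), and `device_map="auto"` assumes `accelerate` is installed:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder -- replace with this model's actual Hugging Face Hub ID.
model_id = "org/model-id"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```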
### Model Sources

<!-- Provide the basic links for the model. -->

- **Demo:** https://nextai.co.in