---
base_model: TomGrc/FusionNet_7Bx2_MoE_14B
inference: true
license: mit
model-index:
- name: FusionNet_7Bx2_MoE_14B
results: []
model_creator: TomGrc
model_name: FusionNet_7Bx2_MoE_14B
model_type: mixtral
quantized_by: Nan-Do
tags:
- mixtral
- Mixture of Experts
- quantization
---
<!-- markdownlint-disable MD041 -->
# FusionNet_7Bx2_MoE_14B
- Original model: [FusionNet_7Bx2_MoE_14B](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B)
<!-- description start -->
## Description
This repo contains GGUF format model files for [FusionNet_7Bx2_MoE_14B](https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B).
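The GGUF files can be loaded with any llama.cpp-compatible runtime. As a minimal sketch (not part of the original card), the snippet below uses the `llama-cpp-python` bindings; the chosen file name, context size, and sampling parameters are assumptions to adapt to your hardware and use case.

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path and generation settings are illustrative assumptions;
# use any of the quantised files listed in the table below.
from llama_cpp import Llama

llm = Llama(
    model_path="./FusionNet_7Bx2_MoE_14B-Q4_K.gguf",  # assumed local file
    n_ctx=4096,        # context window; lower it to reduce memory use
    n_gpu_layers=-1,   # offload all layers if built with GPU support, 0 for CPU-only
)

output = llm(
    "Explain what a Mixture of Experts model is in one paragraph.",
    max_tokens=256,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```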
<!-- README_GGUF.md-provided-files start -->
## Provided files
| Name | Quantisation method | Bits | Size |
| ---- | :----: | ----: | ----: |
| [FusionNet_7Bx2_MoE_14B-Q3_K_S.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q3_K_S.gguf) | Q3_K_S | 3 | 5.59 GB |
| [FusionNet_7Bx2_MoE_14B-Q3_K.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q3_K.gguf) | Q3_K | 3 | 6.21 GB |
| [FusionNet_7Bx2_MoE_14B-Q4_0.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q4_0.gguf) | Q4_0 | 4 | 7.28 GB |
| [FusionNet_7Bx2_MoE_14B-Q4_K_S.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q4_K_S.gguf) | Q4_K_S | 4 | 7.34 GB |
| [FusionNet_7Bx2_MoE_14B-Q4_K.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q4_K.gguf) | Q4_K | 4 | 7.78 GB |
| [FusionNet_7Bx2_MoE_14B-Q4_1.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q4_1.gguf) | Q4_1 | 4 | 8.08 GB |
| [FusionNet_7Bx2_MoE_14B-Q5_K_S.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q5_K_S.gguf) | Q5_K_S | 5 | 8.87 GB |
| [FusionNet_7Bx2_MoE_14B-Q5_K.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q5_K.gguf) | Q5_K | 5 | 9.13 GB |
| [FusionNet_7Bx2_MoE_14B-Q6_K.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q6_K.gguf) | Q6_K | 6 | 10.6 GB |
| [FusionNet_7Bx2_MoE_14B-Q8_0.gguf](https://huggingface.co/Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF/resolve/main/FusionNet_7Bx2_MoE_14B-Q8_0.gguf) | Q8_0 | 8 | 13.7 GB |
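To fetch a single quantisation rather than cloning the whole repository, the sketch below (an illustration, not from the original card) uses the `huggingface_hub` client; swap in whichever file name you want from the table above.

```python
# Illustrative only: download one quantised file with huggingface_hub
# (pip install huggingface_hub). The filename chosen here is an example;
# pick any entry from the table above.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="Nan-Do/FusionNet_7Bx2_MoE_14B-GGUF",
    filename="FusionNet_7Bx2_MoE_14B-Q4_K.gguf",
)
print(local_path)  # cached path to the .gguf file, ready for a llama.cpp runtime
```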
<!-- README_GGUF.md-provided-files end -->