
Description

This repo contains fp16 files of C-Based-2x7B.

Created by taking the best benchmark-scoring model with the smallest size on the HF leaderboard as of today (29/03/2024), which is zhengr/MixTAO-7Bx2-MoE-v8.1, and merging into it some MoE work of my own that uses the human feedback data from Chaiverse, specifically to get a high level of RP ability, intelligence and usability.

Since this is a frankenMoE, I really don't know how it will turn out on the leaderboard, but what interests me anyway is the realism of the human interaction (for RP/ERP).
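
A minimal loading sketch (my own addition, not part of the original card), assuming the standard transformers API and the repo id Undi95/C-Based-2x7B; the device and dtype settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/C-Based-2x7B"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # match the fp16 files in this repo
    device_map="auto",          # needs `accelerate`; spreads the weights over available GPUs
)
```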

Prompt template: Alpaca

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{system prompt}

### Input:
{prompt}

### Response:
{output}
```
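
A short sketch (again my own, not from the card) of filling in this template and generating with the model and tokenizer loaded above; the system prompt, user turn and sampling settings are placeholders:

```python
def build_alpaca_prompt(system_prompt: str, user_input: str) -> str:
    # Fill in the Alpaca template exactly as shown above.
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{system_prompt}\n\n"
        "### Input:\n"
        f"{user_input}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt(
    "You are a character in an ongoing roleplay; stay in character.",  # placeholder system prompt
    "The tavern door swings open and you step inside.",                # placeholder user turn
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```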

If you want to support me, you can here.
