---
base_model: Alsebay/NaruMOE-3x7B-v2
inference: false
library_name: transformers
license: cc-by-nc-4.0
merged_models:
  - Alsebay/NarumashiRTS-V2
  - SanjiWatsuki/Kunoichi-DPO-v2-7B
  - Nitral-AI/KukulStanta-7B
pipeline_tag: text-generation
quantized_by: Suparious
tags:
  - moe
  - merge
  - roleplay
  - Roleplay
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
---

# Alsebay/NaruMOE-3x7B-v2 AWQ

## Model Summary

A MoE (Mixture of Experts) model for roleplaying. Since 7B models are small enough, we can combine them into a bigger model (which can be smarter).

Adapted to handle (some limited) TSF (Trans Sexual Fiction) content, because I have included my pre-trained model in the merge.

It is worse than V1 in logic, but better in expression.
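
## How to use

A minimal sketch of loading the 4-bit AWQ weights through `transformers` (this assumes `autoawq` and `accelerate` are installed and a CUDA GPU is available); the repository id below is an assumption and should be replaced with the actual Hugging Face path of this quant.

```python
# Sketch: load the AWQ checkpoint with transformers and run a short generation.
# Requires: pip install transformers autoawq accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Suparious/NaruMOE-3x7B-v2-AWQ"  # assumed repo id; adjust to the actual path

tokenizer = AutoTokenizer.from_pretrained(model_id)
# AWQ quantized weights are picked up automatically when autoawq is installed.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "You are a roleplay assistant.\nUser: Introduce yourself in character.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same checkpoint can also be served with engines that support AWQ (for example vLLM); consult the engine's documentation for the exact flags.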