---
license: apache-2.0
language:
- en
- fr
- de
---
# Charluv pretrained 13B model
## Based on
- Pygmalion 6B
- LLaMA 13B
## Fine-tuning
- Charluv Dataset 400M
## Quantization
- wbits: 4
- groupsize: 128
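For readers unfamiliar with these settings: "wbits 4, groupsize 128" means weights are stored as 4-bit integers, with one scale and offset shared per group of 128 consecutive weights. The sketch below illustrates that scheme in plain Python; it is a simplified round-to-nearest illustration, not the actual GPTQ algorithm, and all names in it are hypothetical.

```python
import random

WBITS = 4        # 4-bit integers: quantized values 0..15
GROUPSIZE = 128  # one (scale, offset) pair per 128 weights

def quantize_group(weights):
    """Map one group of floats onto 4-bit ints with a shared scale/offset."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2 ** WBITS - 1) or 1.0
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize_group(q, scale, lo):
    """Recover approximate float weights from the 4-bit codes."""
    return [v * scale + lo for v in q]

random.seed(0)
weights = [random.gauss(0.0, 0.02) for _ in range(GROUPSIZE)]
q, scale, lo = quantize_group(weights)
restored = dequantize_group(q, scale, lo)

# Every code fits in 4 bits, and round-to-nearest error stays within one step.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(all(0 <= v < 2 ** WBITS for v in q), max_err <= scale)
```

Smaller group sizes track the weight distribution more closely (lower error) at the cost of storing more scale/offset pairs; 128 is a common middle ground.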
## Perplexity
5.25
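As background, perplexity is the exponential of the mean per-token cross-entropy (in nats), so the figure above corresponds to a mean loss of roughly 1.66 nats per token. The snippet below shows only that relationship; the evaluation dataset and the exact loss value are assumptions, not details stated in this card.

```python
import math

# Hypothetical mean negative log-likelihood per token, in nats.
mean_nll = math.log(5.25)
perplexity = math.exp(mean_nll)
print(round(perplexity, 2))  # 5.25
```

Lower perplexity means the model assigns higher probability to the held-out text; quantization typically costs a small amount of perplexity relative to the full-precision weights.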
We run this model on charluv.com.

This model is NSFW.

Created for fast inference with the KoboldAI 4-bit version (GPTQ version 1).