
trained on a mixture of synthetic and natural RP data, plus storywriting/novel data from various sources (Sugarquill, SCP, and miscellaneous novels), for roughly 17 hours on 2x RTX 3090s from RunPod

GGUF quants: https://huggingface.co/Hasnonname/Qwen2.5-Monte-7B-v0.0-GGUF

it's either overcooked or undercooked, and I can't tell which. regardless, thanks for giving it a shot.