TaxDocumentBeigePaint
This is a merge of pre-trained language models created using mergekit.
⚠️ Development Notice – Stage 1 of 3
This is an early-stage merge prototype.
It has only undergone brief testing and exists to verify architecture and tokenizer stability.
Next steps:
2️⃣ Fine-tuning
3️⃣ DPO Swipe Studio alignment
then maybe: public release candidate.
Use at your own risk, and please, for the love of anarchic cryptids, don't production-deploy a beta. 🧌
Merge Details
Merge Method
This model was merged using the TIES merge method, with aixonlab/Aether-12b as the base.
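For intuition only, here is a minimal, single-tensor sketch of the general TIES idea in Python. This is not mergekit's actual implementation; the function name, the numpy-only setup, and the exact normalization are illustrative assumptions, with the `density` default chosen to mirror the 0.45 used in the config below.

```python
import numpy as np

def ties_merge_tensor(base, finetuned, weights, density=0.45):
    """Toy, per-tensor illustration of the TIES idea (not mergekit's code).

    base      : np.ndarray with the base model's values for one tensor
    finetuned : list of np.ndarray, the same tensor from each fine-tuned model
    weights   : list of float, per-model merge weights
    density   : fraction of each task vector kept after trimming
    """
    # 1) Task vectors: what each fine-tune changed relative to the base.
    deltas = [ft - base for ft in finetuned]

    # 2) Trim: keep only the largest-magnitude `density` fraction of each delta.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.size))
        cutoff = np.sort(np.abs(d), axis=None)[-k]
        trimmed.append(np.where(np.abs(d) >= cutoff, d, 0.0))

    # 3) Elect a sign per parameter from the weighted, trimmed deltas.
    weighted = [w * d for w, d in zip(weights, trimmed)]
    elected_sign = np.sign(sum(weighted))

    # 4) Disjoint merge: average only the deltas that agree with the elected sign.
    agree = [np.where(np.sign(d) == elected_sign, d, 0.0) for d in weighted]
    agreeing_weight = sum(
        w * (a != 0).astype(float) for w, a in zip(weights, agree)
    )
    merged_delta = sum(agree) / np.maximum(agreeing_weight, 1e-8)

    # 5) Apply the merged task vector back onto the base weights.
    return base + merged_delta
```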
Models Merged
The following models were included in the merge:
- aixonlab/Aether-12b
- anthracite-org/magnum-v2-12b
- D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
- nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
Configuration
The following YAML configuration was used to produce this model:
models:
  - model: aixonlab/Aether-12b
    parameters:
      weight: 0.40
  - model: anthracite-org/magnum-v2-12b
    parameters:
      weight: 0.30
  - model: D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
    parameters:
      weight: 0.15
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
    parameters:
      weight: 0.15
merge_method: ties
base_model: aixonlab/Aether-12b
parameters:
  density: 0.45
dtype: float16
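If you want to try the merged checkpoint, a minimal transformers loading sketch follows. The repo id is a placeholder assumption (substitute the actual namespace this card lives under), and the prompt is only an example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; replace with the actual Hugging Face namespace for this model.
repo_id = "your-namespace/TaxDocumentBeigePaint"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the merge's float16 dtype
    device_map="auto",
)

prompt = "Write a short scene about a cryptid filing its taxes."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```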
🧌 Maintained by: Your Mum
🧠 Variant: Text-only, 12B Mistral Nemo merge
💾 Upload date: October 2025
☕ Notes: Made with stubbornness, Python, and profanity.