Eclipsed-Prism-12B

Overview

Eclipsed-Prism-12B was created through a four-stage merge of Starlit-Shadow-12B, Shining-Prism-12B, EtherealAurora-12B, EsotericSage-12B, and Hollow-Aether-12B using custom merge methods. Each stage produces a named intermediate (First, Second, Third) that feeds into the next, and the final stage folds Starlit-Shadow-12B back into the result.

Multi-stage merge configuration

```yaml
name: First
merge_method: acl
base_model: Vortex5/Starlit-Shadow-12B
models:
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  strength: 0.75
  selectivity: 0.95
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Second
merge_method: amsf
models:
  - model: First
  - model: yamatazen/EsotericSage-12B
  - model: Vortex5/Hollow-Aether-12B
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
name: Third
merge_method: saef
models:
  - model: Second
  - model: Vortex5/Shining-Prism-12B
  - model: yamatazen/EtherealAurora-12B
parameters:
  paradox: 0.45
  strength: 0.9
  boost: 0.55
  modes: 2
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
---
# no name needed for the final model
merge_method: sm2f
base_model: Third
models:
  - model: Vortex5/Starlit-Shadow-12B
parameters:
  focus: 0.55
  trust: 0.60
dtype: bfloat16
tokenizer:
  source: Vortex5/Starlit-Shadow-12B
```

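The four YAML documents run in sequence: each stage's named output (First, Second, Third) is referenced as an input by a later stage, and the unnamed final document produces Eclipsed-Prism-12B. The sketch below executes that loop with mergekit's Python API; it assumes a mergekit build (or fork) that registers the custom methods acl, amsf, saef, and sm2f, and the config path and output directories are placeholders.

```python
# Sketch: execute the multi-document merge config stage by stage.
# Assumes a mergekit install that knows the custom merge methods
# (acl, amsf, saef, sm2f); stock mergekit does not ship them.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("eclipsed-prism.yml", encoding="utf-8") as fp:  # placeholder path
    stages = list(yaml.safe_load_all(fp))

outputs = {}  # stage name -> local path of the merged weights
for i, doc in enumerate(stages):
    name = doc.pop("name", "final")  # the last document is unnamed
    # Rewrite references to earlier stages (e.g. "First") to their output paths.
    for ref in doc.get("models", []):
        if ref["model"] in outputs:
            ref["model"] = outputs[ref["model"]]
    if doc.get("base_model") in outputs:
        doc["base_model"] = outputs[doc["base_model"]]
    out_path = f"./stages/{i}-{name}"
    run_merge(
        MergeConfiguration.model_validate(doc),
        out_path,
        options=MergeOptions(cuda=True, copy_tokenizer=True),
    )
    outputs[name] = out_path
```

Newer mergekit releases also include a mergekit-multi entry point for multi-document configs like this one, which handles the intermediate bookkeeping itself.
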
Intended Use

🌒 Storytelling
🎭 Roleplay
✨ Creative writing
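
A minimal inference sketch with transformers follows; the prompt, chat-template usage, and sampling settings are illustrative assumptions rather than settings published for this model. Since the final stage copies the Starlit-Shadow-12B tokenizer, the model loads like any other 12B causal LM.

```python
# Minimal inference sketch; generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/Eclipsed-Prism-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Assumes the tokenizer ships a chat template (inherited from Starlit-Shadow-12B).
messages = [{"role": "user", "content": "Write the opening scene of a moonlit heist."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=300, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```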