# Llama3-Omphalos-12B

Llama3-Omphalos-12B is a merge of the following models using LazyMergekit:
* [bluuwhale/L3-SthenoMaidBlackroot-8B-V1](https://huggingface.co/bluuwhale/L3-SthenoMaidBlackroot-8B-V1)
* [Casual-Autopsy/L3-Umbral-Mind-RP-v1.0-8B](https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v1.0-8B)

## 🧩 Configuration

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- sources:
  - layer_range: [4, 12]
    model: Casual-Autopsy/L3-Umbral-Mind-RP-v1.0-8B
- sources:
  - layer_range: [9, 16]
    model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- sources:
  - layer_range: [13, 20]
    model: Casual-Autopsy/L3-Umbral-Mind-RP-v1.0-8B
- sources:
  - layer_range: [17, 24]
    model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- sources:
  - layer_range: [21, 28]
    model: Casual-Autopsy/L3-Umbral-Mind-RP-v1.0-8B
- sources:
  - layer_range: [25, 32]
    model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
```
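
The passthrough merge simply stacks the layer slices listed above from the two 8B source models into one deeper network, which is where the roughly 12B parameter count comes from. For inference, the result behaves like any other Llama-3-based causal language model. The snippet below is a minimal sketch using 🤗 Transformers, assuming the weights are hosted on the Hub as `Tremontaine/Llama3-Omphalos-12B` and that the tokenizer ships with a Llama-3 chat template; adjust the repo id, dtype, and sampling parameters as needed.

```python
# Minimal inference sketch (assumes the repo id Tremontaine/Llama3-Omphalos-12B
# and a bundled Llama-3 chat template; adjust to your setup).
import torch
import transformers
from transformers import AutoTokenizer

model_id = "Tremontaine/Llama3-Omphalos-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

# Build a chat-formatted prompt and generate a response.
messages = [{"role": "user", "content": "Write a short scene set in a lighthouse."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```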