# 4x1b

Hey there! 👋 Welcome to 4x1b! This is a passthrough merge that stacks overlapping layer slices of OEvortex/HelpingAI2-6B, put together using the awesome VortexMerge kit.

Let's see what we've got in this merge:

## 🧩 Configuration

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: OEvortex/HelpingAI2-6B
- sources:
  - layer_range: [4, 12]
    model: OEvortex/HelpingAI2-6B
- sources:
  - layer_range: [8, 16]
    model: OEvortex/HelpingAI2-6B
- sources:
  - layer_range: [12, 21]
    model: OEvortex/HelpingAI2-6B
```
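
There's no usage example on the card yet, so here is a minimal, untested sketch of loading the merged checkpoint with 🤗 Transformers. The repo id `OEvortex/4x1b` is an assumption inferred from the model name; swap in the actual Hub path or a local directory.

```python
# Minimal usage sketch (untested). "OEvortex/4x1b" is an assumed repo id;
# replace it with the real Hub path or a local directory for this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OEvortex/4x1b"  # assumption, adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge's dtype above
    device_map="auto",
)

prompt = "Hey there! What can you do?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```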