
An extension of my effort to create Eileithyia-20B. This model was made by following the recipe below, then inverting the recipe and SLERPing the two resulting merges back together at 0.5, hopefully fusing them into one block for use with Harmonia (see the sketch after the config).

```yaml
slices:
  - sources:
    - model: microsoft/Orca-2-13b
      layer_range: [0, 16]
  - sources:
    - model: athirdpath/Eileithyia-13B
      layer_range: [8, 24]
  - sources:
    - model: microsoft/Orca-2-13b
      layer_range: [17, 32]
  - sources:
    - model: athirdpath/Eileithyia-13B
      layer_range: [25, 40]
merge_method: passthrough
dtype: float16
```
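
The "inverted" recipe is simply the same config with the two source models swapped in each slice. The final SLERP step might then look roughly like the sketch below; this is a minimal, hypothetical mergekit config, not the exact one used. The intermediate model paths and the 62-layer total (16 + 16 + 15 + 15 from the slices above) are assumptions; only the 0.5 interpolation factor comes from the description above.

```yaml
# Hypothetical sketch: SLERP the two 62-layer passthrough merges together at t = 0.5
slices:
  - sources:
    - model: ./CleverMommy-mix-20b           # merge from the recipe above (assumed local path)
      layer_range: [0, 62]
    - model: ./CleverMommy-mix-20b-inverted  # hypothetical name for the inverted merge
      layer_range: [0, 62]
merge_method: slerp
base_model: ./CleverMommy-mix-20b
parameters:
  t: 0.5
dtype: float16
```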

Thanks to Undi95 for pioneering the recipe.

