HeilbronnGPT-Alpha

This model is part of a collection of 12 German finetuned language models, all based on Microsoft Phi-2.
This is a merge of pre-trained language models created using mergekit.

This model was merged with the TIES merge method, using microsoft/phi-2 as the base model.
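As a rough illustration of what the TIES method does (trim each task vector to its largest-magnitude entries, elect a per-parameter sign by weighted vote, then average only the agreeing deltas), here is a toy numpy sketch over 1-D parameter vectors. The function name and signature are illustrative, not mergekit's API:

```python
import numpy as np

def trim(tau, density):
    """Zero out all but the top `density` fraction of entries by magnitude (1-D toy)."""
    k = max(1, int(round(density * tau.size)))
    thresh = np.sort(np.abs(tau))[-k]
    return np.where(np.abs(tau) >= thresh, tau, 0.0)

def ties_merge(base, finetuned, density=0.5, weights=None, normalize=True):
    """Toy TIES merge: trim task vectors, elect signs, merge agreeing entries."""
    n = len(finetuned)
    w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
    # 1. Weighted task vectors (delta from base), trimmed to the densest entries.
    taus = np.stack([w[i] * trim(m - base, density) for i, m in enumerate(finetuned)])
    # 2. Elect a sign per parameter from the summed task vectors.
    elected = np.sign(taus.sum(axis=0))
    # 3. Keep only entries whose sign agrees with the elected sign.
    agree = (np.sign(taus) == elected) & (elected != 0)
    num = np.where(agree, taus, 0.0).sum(axis=0)
    if normalize:
        # Divide by the total weight of the agreeing models (`normalize: true`).
        denom = np.where(agree, w[:, None], 0.0).sum(axis=0)
        merged_tau = np.where(denom > 0, num / np.maximum(denom, 1e-12), 0.0)
    else:
        merged_tau = num
    return base + merged_tau
```

With `density: 0.5` and `weight: 0.5` for every model, as in the config below, each task vector keeps its top half of entries and all models vote with equal weight.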
The models included in the merge are listed in the YAML configuration below, which was used to produce this model:
```yaml
models:
  - model: microsoft/phi-2
  - model: /home/ubuntu/llm_mill/output/00_merged_phi-2_airoboros-3.0_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/01_merged_phi-2_alpaca-gpt4_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/02_merged_phi-2_booksum_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/03_merged_phi-2_dolly-15k_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/04_merged_phi-2_dolphin_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/05_merged_phi-2_evol-instruct_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/06_merged_phi-2_oasst_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/07_merged_phi-2_openschnabeltier_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/08_merged_phi-2_ultrachat_chat_de
    parameters:
      density: 0.5
      weight: 0.5
  - model: /home/ubuntu/llm_mill/output/09_merged_phi-2_wiki_qa_de
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: microsoft/phi-2
parameters:
  normalize: true
dtype: float16
```
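To reproduce a merge like this, a config of the shape above is passed to mergekit's CLI. This is a sketch assuming mergekit is installed locally; the config filename and output directory are placeholders:

```shell
# Install mergekit, then run the merge described by the YAML config.
# "ties-config.yml" and "./merged-model" are placeholder paths.
pip install mergekit
mergekit-yaml ties-config.yml ./merged-model
```

The merged model is written to the output directory in Hugging Face format and can then be loaded like any other checkpoint.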