---
base_model:
  - mistralai/Mistral-7B-Instruct-v0.2
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the linear merge method.
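For intuition, a linear merge is a weighted average of corresponding tensors across the source checkpoints. The sketch below is a minimal plain-PyTorch illustration of that idea, not mergekit's actual implementation; mergekit additionally handles layer slicing, sharded checkpoints, and tokenizer handling on top of this.

```python
import torch

def linear_merge(state_dicts, weights):
    """Conceptual sketch of the linear merge method: a weighted average
    of matching tensors from several state dicts."""
    assert len(state_dicts) == len(weights)
    total = sum(weights)
    merged = {}
    for key in state_dicts[0]:
        # Accumulate each tensor in float32 for numerical stability.
        merged[key] = sum(
            w * sd[key].to(torch.float32) for sd, w in zip(state_dicts, weights)
        ) / total
    # The configuration below requests float16 for the merged model.
    return {k: v.to(torch.float16) for k, v in merged.items()}
```

With both sources weighted 0.5, as in the configuration below, this reduces to a plain average of the corresponding tensors.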

### Models Merged

The following models were included in the merge:

* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: linear
slices:
  - sources:
      - layer_range: [0, 16] # Assuming the first half of the model is more general and can be reduced more
        model: mistralai/Mistral-7B-Instruct-v0.2
        parameters:
          weight: 0.5 # Reduce the weight of the first half to make room for the second half
      - layer_range: [16, 32] # Assuming the second half of the model is more specialized and can be reduced less
        model: mistralai/Mistral-7B-Instruct-v0.2
        parameters:
          weight: 0.5 # Maintain the weight of the second half
```
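
The merge can be reproduced by passing this configuration to mergekit's `mergekit-yaml` entry point (see the mergekit documentation for the exact invocation). Once produced, the checkpoint loads like any other `transformers` causal LM. The snippet below is a usage sketch: the repository id is a placeholder for this repo's full Hub id, and it assumes the instruct prompt format of the Mistral-7B-Instruct-v0.2 base carries over to the merge.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with the full Hub id of this repository.
repo_id = "HX-Mistral-3B_v0.1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)

# Mistral-Instruct-style prompt, assumed to be inherited from the base model.
prompt = "[INST] Write a one-line summary of model merging. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```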