---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# tmp
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged with the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, using ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042 as the base model.
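In task arithmetic, each fine-tuned model is reduced to a task vector (its weights minus the base model's weights), and the merge adds a weighted sum of those vectors back onto the base. A minimal toy sketch with made-up 1-D numbers (a real merge applies this operation per tensor, with the per-slice `weight:` values from the config below):

```python
# Task arithmetic: merged = base + sum_i w_i * (model_i - base).
# All values here are illustrative, not taken from the actual checkpoints.
base = [1.0, 2.0, 3.0]
ft_a = [1.5, 2.5, 3.0]   # hypothetical fine-tune A
ft_b = [0.5, 1.0, 3.0]   # hypothetical fine-tune B

w_a, w_b = 0.5, 0.25     # per-model merge weights

# Task vector = fine-tuned weights minus base weights.
tau_a = [f - b for f, b in zip(ft_a, base)]
tau_b = [f - b for f, b in zip(ft_b, base)]

# Add the weighted task vectors back onto the base.
merged = [b + w_a * ta + w_b * tb for b, ta, tb in zip(base, tau_a, tau_b)]
print(merged)  # → [1.125, 2.0, 3.0]
```

Note that weights are unconstrained: negative values subtract a task vector, which this configuration uses in several slices.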
### Models Merged
The following models were included in the merge:
* ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
* ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 4]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: 1.4789314607153292
  - layer_range: [0, 4]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: -0.07832427503458578
  - layer_range: [0, 4]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: 0.13007468603078542
- sources:
  - layer_range: [4, 8]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: 0.4483365644314186
  - layer_range: [4, 8]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.16093134122246358
  - layer_range: [4, 8]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: -0.1488430200266594
- sources:
  - layer_range: [8, 12]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: -0.269727274803052
  - layer_range: [8, 12]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: -0.3755435960674383
  - layer_range: [8, 12]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: -0.2516657269635488
- sources:
  - layer_range: [12, 16]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: 1.0256352345282738
  - layer_range: [12, 16]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.09650999485083946
  - layer_range: [12, 16]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: 0.1027707396989868
- sources:
  - layer_range: [16, 20]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: 0.48894736524264926
  - layer_range: [16, 20]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.2781634870717354
  - layer_range: [16, 20]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: 0.36692173901259717
- sources:
  - layer_range: [20, 24]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: -0.8662869061036966
  - layer_range: [20, 24]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.31966012442261516
  - layer_range: [20, 24]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: 0.279482404393017
- sources:
  - layer_range: [24, 28]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: -0.392804806487449
  - layer_range: [24, 28]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.16459417480133676
  - layer_range: [24, 28]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: 0.3896145044637547
- sources:
  - layer_range: [28, 32]
    model: ./evol_merge_storage/input_models/RakutenAI-7B-instruct_2471054042
    parameters:
      weight: 1.4221277582195517
  - layer_range: [28, 32]
    model: ./evol_merge_storage/input_models/Mistral-7B-v0.1-coder-sql-en-v0.1_3262149936
    parameters:
      weight: 0.8487897976056422
  - layer_range: [28, 32]
    model: ./evol_merge_storage/input_models/Mistral-7B-Instruct-v0.2_674785087
    parameters:
      weight: -0.3464180330027282
tokenizer_source: base
```
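Because the config sets `normalize: 0.0`, the slice weights are applied exactly as given rather than being rescaled to sum to 1, so each 4-layer block gets a different (and sometimes negative) total contribution. A quick check, with the `weight:` values copied from the YAML above:

```python
# Per-source `weight:` values from the config, keyed by layer range.
# Source order: RakutenAI-7B-instruct, Mistral-7B-v0.1-coder-sql, Mistral-7B-Instruct-v0.2.
slice_weights = {
    "0-4":   [1.4789314607153292, -0.07832427503458578, 0.13007468603078542],
    "4-8":   [0.4483365644314186, 0.16093134122246358, -0.1488430200266594],
    "8-12":  [-0.269727274803052, -0.3755435960674383, -0.2516657269635488],
    "12-16": [1.0256352345282738, 0.09650999485083946, 0.1027707396989868],
    "16-20": [0.48894736524264926, 0.2781634870717354, 0.36692173901259717],
    "20-24": [-0.8662869061036966, 0.31966012442261516, 0.279482404393017],
    "24-28": [-0.392804806487449, 0.16459417480133676, 0.3896145044637547],
    "28-32": [1.4221277582195517, 0.8487897976056422, -0.3464180330027282],
}

# Total task-vector weight applied to each 4-layer slice.
for layers, weights in slice_weights.items():
    print(f"layers {layers}: total weight {sum(weights):+.3f}")
```

The sums range from roughly -0.9 (layers 8-12, where every source is subtracted) to about +1.9 (layers 28-32), which is typical of an evolutionary weight search rather than a hand-tuned merge.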