---
license: cc-by-nc-4.0
tags:
- merge
- mergekit
- lazymergekit
- pruning
- alnrg2arg/blockchainlabs_7B_merged_test2_4
- mlabonne/NeuralBeagle14-7B
- udkai/Turdus
---

# blockchainlabs_7B_merged_test2_4_prune

blockchainlabs_7B_merged_test2_4_prune is a pruned version of [alnrg2arg/blockchainlabs_7B_merged_test2_4](https://huggingface.co/alnrg2arg/blockchainlabs_7B_merged_test2_4), which is a merge of the following models made with [mergekit](https://github.com/cg123/mergekit) (an illustrative merge configuration is sketched after the list):

* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
* [udkai/Turdus](https://huggingface.co/udkai/Turdus)

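This card does not reproduce the merge configuration itself. For illustration only, a typical mergekit SLERP configuration over these two models could look like the following; the merge method, layer ranges, and interpolation weights here are assumptions, not the configuration actually used:

```yaml
# Hypothetical example -- NOT the configuration used for this model.
slices:
  - sources:
      - model: mlabonne/NeuralBeagle14-7B
        layer_range: [0, 32]
      - model: udkai/Turdus
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralBeagle14-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: float16
```
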
Pruning toolkit used: [wanda](https://github.com/locuslab/wanda?tab=readme-ov-file#ablation-on-obs-weight-update). wanda scores each weight by the product of its magnitude and the norm of the corresponding input activations on a small calibration set, then zeroes the lowest-scoring weights without any retraining.
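
As a rough illustration of that criterion (a minimal sketch, not the wanda repository's code; the function and argument names are invented here):

```python
import torch

def wanda_prune_layer(weight: torch.Tensor, act_norm: torch.Tensor,
                      sparsity: float = 0.5) -> torch.Tensor:
    """Illustrative wanda-style pruning of one linear layer.

    weight:   (out_features, in_features) weight matrix
    act_norm: (in_features,) L2 norm of each input feature over a
              calibration set
    """
    # wanda score: |W_ij| * ||X_j||_2, compared within each output row
    score = weight.abs() * act_norm
    n_prune = int(weight.shape[1] * sparsity)
    # find the n_prune lowest-scoring weights per row and zero them
    _, idx = torch.topk(score, n_prune, dim=1, largest=False)
    mask = torch.ones_like(weight, dtype=torch.bool)
    mask.scatter_(1, idx, False)
    return weight * mask
```
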
## 🧩 Configuration

```json
{
  "_name_or_path": "alnrg2arg/blockchainlabs_7B_merged_test2_4_prun",
  "architectures": [
    "MistralForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 32768,
  "model_type": "mistral",
  "num_attention_heads": 32,
  "num_hidden_layers": 32,
  "num_key_value_heads": 8,
  "rms_norm_eps": 1e-05,
  "rope_theta": 10000.0,
  "sliding_window": 4096,
  "tie_word_embeddings": false,
  "torch_dtype": "float16",
  "transformers_version": "4.36.2",
  "use_cache": false,
  "vocab_size": 32000
}
```
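
## 💻 Usage

A minimal loading-and-generation sketch with 🤗 Transformers, assuming the model is published under the repo id `alnrg2arg/blockchainlabs_7B_merged_test2_4_prune` (the prompt is illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, matching this card's title.
model_id = "alnrg2arg/blockchainlabs_7B_merged_test2_4_prune"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # the config ships float16 weights
    device_map="auto",
)

prompt = "What does pruning do to a language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```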