---
license: apache-2.0
language:
- en
---
# Flux-Base-Optimized
`flux-base-optimized` is the base model used for finetuning the series of `flux-7b` models.
It was created by a hierarchical SLERP merge of the following models:
* mistralai/Mistral-7B-v0.1 (Apache 2.0)
* teknium/OpenHermes-2.5-Mistral-7B (Apache 2.0)
* Intel/neural-chat-7b-v3-3 (Apache 2.0)
* meta-math/MetaMath-Mistral-7B (Apache 2.0)
* openchat/openchat-3.5-0106 (previously openchat/openchat-3.5-1210) (Apache 2.0)
Here's how we did the hierarchical SLERP merge:
```
[flux-base-optimized]
↑
|
[stage-1]-+-[openchat]
↑
|
[stage-0]-+-[meta-math]
↑
|
[openhermes]-+-[neural-chat]
```
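For illustration, here is a minimal sketch of how such a pairwise SLERP merge could be done in plain PyTorch, following the hierarchy in the diagram (leaf pairs merged first). The `slerp`/`slerp_state_dicts` helpers and the uniform `t=0.5` interpolation factor are assumptions for the sketch, not the exact recipe or tooling used for `flux-base-optimized`.

```python
# Sketch only: pairwise SLERP merge of model weights, following the diagram above.
# The t=0.5 factor and the per-tensor merge strategy are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM


def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1.0 - t) * omega) / so) * a_flat + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)


def slerp_state_dicts(sd_a: dict, sd_b: dict, t: float = 0.5) -> dict:
    """SLERP-merge two state dicts tensor by tensor."""
    return {k: slerp(t, sd_a[k], sd_b[k]) for k in sd_a}


def load_sd(name: str) -> dict:
    return AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.bfloat16).state_dict()


# Hierarchical merge, leaf pairs first, as in the diagram:
stage_0 = slerp_state_dicts(load_sd("teknium/OpenHermes-2.5-Mistral-7B"),
                            load_sd("Intel/neural-chat-7b-v3-3"))
stage_1 = slerp_state_dicts(stage_0, load_sd("meta-math/MetaMath-Mistral-7B"))
flux_base = slerp_state_dicts(stage_1, load_sd("openchat/openchat-3.5-0106"))
```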