---
tags:
- merge
- mergekit
- lazymergekit
- mistral
- ResplendentAI/Datura_7B
- Epiculous/Mika-7B
base_model:
- ResplendentAI/Datura_7B
- Epiculous/Mika-7B
language:
- en
library_name: transformers
license: apache-2.0
---
# <img src="https://cdn-icons-png.flaticon.com/512/1531/1531037.png" alt="favicon" style="display: inline-block; vertical-align: middle; width: 20px; height: 20px; margin-right: 10px;"> Foxglove_7B
Foxglove_7B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
* [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B)
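## Usage

The card does not include a usage snippet; the sketch below shows one way to load the merge with the Hugging Face `transformers` library. The repo id `Ramadhirra/Foxglove_7B`, the `float16` dtype, and the sampling settings are assumptions for illustration, not values recorded in this card.

```python
def generate(prompt: str,
             model_id: str = "Ramadhirra/Foxglove_7B",  # assumed repo id
             max_new_tokens: int = 256) -> str:
    """Generate a completion from the merged model (minimal sketch)."""
    # Heavy dependencies are imported lazily so importing this file is cheap.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,   # assumed dtype; adjust for your hardware
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,             # illustrative sampling settings
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example (downloads ~14 GB of weights on first call):
# print(generate("Write a short poem about foxgloves."))
```

Note that the first call to `generate` downloads the full model weights; a GPU with sufficient memory (or CPU offloading via `device_map="auto"`) is assumed.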
## Configuration
Foxglove_7B was merged with the following settings:
* **Merge method:** SLERP
* **Base model:** ResplendentAI/Datura_7B
* **Slice sources:**
  * ResplendentAI/Datura_7B
  * Epiculous/Mika-7B
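In mergekit's YAML form, these settings correspond to roughly the following. This is a sketch, not the original config: the `layer_range`, interpolation weight `t`, and `dtype` values are illustrative placeholders that are not recorded in this card.

```yaml
slices:
  - sources:
      - model: ResplendentAI/Datura_7B
        layer_range: [0, 32]   # illustrative; actual range not recorded here
      - model: Epiculous/Mika-7B
        layer_range: [0, 32]   # illustrative
merge_method: slerp
base_model: ResplendentAI/Datura_7B
parameters:
  t: 0.5                       # illustrative interpolation weight
dtype: float16                 # illustrative
```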