---
tags:
  - merge
  - mergekit
  - lazymergekit
  - mistral
  - ResplendentAI/Datura_7B
  - Epiculous/Mika-7B
base_model:
  - ResplendentAI/Datura_7B
  - Epiculous/Mika-7B
language:
  - en
library_name: transformers
license: other
---

# Foxglove_7B

Foxglove_7B is a merge of the following models using LazyMergekit:

* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
* [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B)

## Configuration

```yaml
slices:
  - sources:
      - model: ResplendentAI/Datura_7B
      - model: Epiculous/Mika-7B
merge_method: slerp
base_model: ResplendentAI/Datura_7B
```
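
## Usage

Below is a minimal sketch of loading the merged model for text generation with the Hugging Face Transformers pipeline. The repository id `Ramadhirra/Foxglove_7B` is an assumption based on this page's location; replace it with the actual repo that hosts the merged weights.

```python
# Minimal usage sketch: load the merged model and run chat-style generation.
# The repo id below is assumed, not confirmed by the model card.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "Ramadhirra/Foxglove_7B"  # assumed repo id -- adjust if it differs

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a prompt with the tokenizer's chat template (Mistral-based merges
# typically inherit one from the base model).
messages = [{"role": "user", "content": "What is a model merge?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = generator(
    prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
)
print(outputs[0]["generated_text"])
```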