---
tags:
  - merge
  - mergekit
  - lazymergekit
  - mistral
  - ResplendentAI/Datura_7B
  - Epiculous/Mika-7B
base_model:
  - ResplendentAI/Datura_7B
  - Epiculous/Mika-7B
language:
  - en
library_name: transformers
license: apache-2.0
---

# Foxglove_7B

Foxglove_7B is a merge of the following models using LazyMergekit:

* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
* [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B)

## Configuration

* Slices:
  * Sources:
    * Model: ResplendentAI/Datura_7B
    * Model: Epiculous/Mika-7B
* Merge method: SLERP
* Base model: ResplendentAI/Datura_7B
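
For reference, a mergekit SLERP configuration with the structure listed above would look roughly like the sketch below. The layer ranges, interpolation weight `t`, and `dtype` are illustrative placeholders, not values taken from this card.

```yaml
slices:
  - sources:
      - model: ResplendentAI/Datura_7B
        layer_range: [0, 32]  # assumed full-depth range for a Mistral-7B model; not stated on this card
      - model: Epiculous/Mika-7B
        layer_range: [0, 32]  # assumed full-depth range; not stated on this card
merge_method: slerp
base_model: ResplendentAI/Datura_7B
parameters:
  t:
    - value: 0.5  # illustrative interpolation weight; the actual value is not stated on this card
dtype: bfloat16  # illustrative; the actual dtype is not stated on this card
```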