
Winter Garden 7B - Γ

It was mentioned that we are in the open AI dark winter, so I thought I would make myself a nice winter garden.

An Experiment

This time I did something a bit different. I started from

  • Mistral-7B-v0.1

and merged in

  • Yarn-Mistral-7b-128k
  • Thespis-Balanced-7b-v1
  • ZySec-7B-v1
  • LemonadeRP-4.5.3
  • Noromaid-7B-0.4-DPO
  • Prima-LelantaclesV6-7b
  • West-Hermes-7B
  • Capricorn-7B-DPO
  • kun-kunoichi-v1-DPO-v2-SLERP-7B
  • Kunoichi-DPO-v2-7B
  • WestLake-7B-v2-laser-truthy-dpo
  • StrangeMerges_6-7B-dare_ties
  • NeuralMarcoro14-7B
  • multi_verse_model
  • Multi-Verse-RP-7B
  • MonarchLake-7B
  • AlphaMonarch-7B

in an iterative DARE-TIES tree merge, ordering the merges by tensor-relative cosine similarity until the merge branches resolved to a single model.
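For illustration, here is a minimal sketch of that ordering idea: greedily merge the most similar pair of checkpoints and repeat until one remains. This is not the exact pipeline used for this model; the actual DARE-TIES merges were done with merge tooling, and `merge_pair` below is a hypothetical stand-in that simply averages tensors so the example stays runnable.

```python
import torch
import torch.nn.functional as F


def mean_cosine_similarity(a: dict, b: dict) -> float:
    """Average per-tensor cosine similarity between two state dicts."""
    sims = []
    for name, ta in a.items():
        tb = b[name]
        sims.append(F.cosine_similarity(ta.flatten(), tb.flatten(), dim=0).item())
    return sum(sims) / len(sims)


def merge_pair(a: dict, b: dict) -> dict:
    """Hypothetical stand-in for a real DARE-TIES merge of two checkpoints."""
    return {name: (ta + b[name]) / 2 for name, ta in a.items()}


def tree_merge(models: list) -> dict:
    """Repeatedly merge the most similar pair until one model remains."""
    models = list(models)
    while len(models) > 1:
        # Find the pair with the highest mean tensor-wise cosine similarity.
        best = None
        for i in range(len(models)):
            for j in range(i + 1, len(models)):
                sim = mean_cosine_similarity(models[i], models[j])
                if best is None or sim > best[0]:
                    best = (sim, i, j)
        _, i, j = best
        merged = merge_pair(models[i], models[j])
        # Replace the pair with their merge and keep going.
        models = [m for k, m in enumerate(models) if k not in (i, j)] + [merged]
    return models[0]


if __name__ == "__main__":
    # Toy demonstration on tiny random "checkpoints".
    torch.manual_seed(0)
    fake_models = [{"w": torch.randn(4, 4), "b": torch.randn(4)} for _ in range(5)]
    result = tree_merge(fake_models)
    print({k: v.shape for k, v in result.items()})
```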

Chat Template

The basic Mistral <s>[INST][/INST] template works pretty well. The model seems smart, but we will see.
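A minimal generation sketch using that format is below; the repo id is a placeholder, so substitute the actual model path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/Winter-Garden-7B-gamma"  # placeholder; use the real repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Plain Mistral [INST] wrapping; the tokenizer prepends <s> when encoding.
prompt = "[INST] Write a haiku about a winter garden. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```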

Scores

| Metric     | Score |
|------------|-------|
| Average    |       |
| ARC        |       |
| HellaSwag  |       |
| MMLU       |       |
| TruthfulQA |       |
| Winogrande |       |
| GSM8K      |       |
