
Re:MythoMax (ReMM) is a recreation trial of the original MythoMax-L2-13B, made with updated models.

This merge uses SLERP [TESTING] to merge ReML and Huginn v1.2.

Commands used and explanation:

Due to hardware limitations, some merges were done in two parts.

- Recreate ReML : Mythologic (v2) (Chronos/Hermes/Airoboros)
=> Replacing Chronos with The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16 (0.30)
=> Replacing Airoboros with jondurbin/airoboros-l2-13b-2.1 (latest version) (0.40)
=> Keeping NousResearch/Nous-Hermes-Llama2-13b (0.30)

Part 1: python ties_merge.py TheBloke/Llama-2-13B-fp16 ./ReML-L2-13B-part1  --merge The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16 --density 0.42 --merge jondurbin/airoboros-l2-13b-2.1 --density 0.56 --cuda

Part 2: python ties_merge.py TheBloke/Llama-2-13B-fp16 ./ReML-L2-13B --merge NousResearch/Nous-Hermes-Llama2-13b --density 0.30 --merge Undi95/ReML-L2-13B-part1 --density 0.70 --cuda
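
For illustration, here is a minimal sketch of what a TIES-style merge with --density does, assuming the script follows the usual recipe of keeping only the top fraction (by magnitude) of each fine-tune's delta from the base and adding the retained deltas back onto the base. The function names and the example call are illustrative, not the actual ties_merge.py code:

```python
import torch

def sparsify_delta(finetuned: torch.Tensor, base: torch.Tensor, density: float) -> torch.Tensor:
    """Keep only the largest-magnitude `density` fraction of the fine-tune's delta, zero the rest."""
    delta = finetuned - base
    k = max(1, int(delta.numel() * density))
    threshold = torch.topk(delta.abs().flatten(), k).values[-1]
    return torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))

def merge_tensor(base: torch.Tensor, finetunes: list[torch.Tensor], densities: list[float]) -> torch.Tensor:
    """Add the sparsified delta of each fine-tune back onto the base weights."""
    merged = base.clone()
    for ft, d in zip(finetunes, densities):
        merged = merged + sparsify_delta(ft, base, d)
    return merged

# Illustrative call mirroring Part 1: Chronos-Beluga at density 0.42, Airoboros at 0.56.
# merged = merge_tensor(base_w, [chronos_w, airoboros_w], [0.42, 0.56])
```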

With that:
- Recreate ReMM : MythoMax (v2) (Mythologic/Huginn v1)
=> Replacing Mythologic with the recreated ReML above (0.5)
=> Replacing Huginn with The-Face-Of-Goonery/Huginn-13b-v1.2 (hottest) (0.5)

Part 3: python slerpmergelm.py "The-Face-Of-Goonery_Huginn-13b-v1.2" "Undi95_ReML-L2-13B" "result"

The version of SLERP used was modified to allow running it in a notebook: https://github.com/Undi95/LLM-SLERP-MergeTest/tree/main (Thanks @Vali)
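
For context, the core of a SLERP weight merge is spherical linear interpolation between the two models' flattened weight tensors. The sketch below shows the general formula (with a linear-interpolation fallback for nearly parallel vectors); it is illustrative and not the exact code in the linked repo:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between two weight tensors at ratio t (0 -> v0, 1 -> v1)."""
    v0_f, v1_f = v0.flatten().float(), v1.flatten().float()
    # Angle between the two weight vectors.
    cos_theta = torch.dot(v0_f, v1_f) / (v0_f.norm() * v1_f.norm() + eps)
    theta = torch.arccos(cos_theta.clamp(-1.0, 1.0))
    if theta.abs() < 1e-4:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        out = (1 - t) * v0_f + t * v1_f
    else:
        sin_theta = torch.sin(theta)
        out = (torch.sin((1 - t) * theta) / sin_theta) * v0_f + (torch.sin(t * theta) / sin_theta) * v1_f
    return out.reshape(v0.shape).to(v0.dtype)

# Illustrative 50/50 blend mirroring Part 3:
# merged = slerp(0.5, huginn_weight, reml_weight)
```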

Description

This repo contains fp16 files of ReMM-SLERP, a recreation of the original MythoMax, updated and merged with SLERP.

Models used

  • TheBloke/Llama-2-13B-fp16 (base)
  • The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16
  • jondurbin/airoboros-l2-13b-2.1
  • NousResearch/Nous-Hermes-Llama2-13b
  • The-Face-Of-Goonery/Huginn-13b-v1.2
  • ReML-L2-13B (Private recreation trial of an updated Mythologic-L2-13B)

Prompt template: Alpaca

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
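
A minimal inference sketch with the Transformers library, assuming the fp16 files in this repo load directly with AutoModelForCausalLM; the instruction text and generation settings below are placeholders:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/ReMM-SLERP-L2-13B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

# Wrap the user request in the Alpaca template shown above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short story about a lighthouse keeper.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, temperature=0.8, do_sample=True)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```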

Special thanks to Sushi kek

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric | Value |
|---|---|
| Avg. | 50.99 |
| ARC (25-shot) | 60.92 |
| HellaSwag (10-shot) | 83.56 |
| MMLU (5-shot) | 55.33 |
| TruthfulQA (0-shot) | 51.97 |
| Winogrande (5-shot) | 75.22 |
| GSM8K (5-shot) | 9.17 |
| DROP (3-shot) | 20.76 |