---
language:
  - en
license: other
tags:
  - facebook
  - meta
  - pytorch
  - llama
  - llama-3
  - moe
  - code
base_model: meta-llama/Meta-Llama-3-8B-Instruct
pipeline_tag: text-generation
license_name: llama3
license_link: LICENSE
model-index:
  - name: Llama-3-Aplite-Instruct-4x8B-MoE
    results:
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: AI2 Reasoning Challenge (25-Shot)
          type: ai2_arc
          config: ARC-Challenge
          split: test
          args:
            num_few_shot: 25
        metrics:
          - type: acc_norm
            value: 61.86
            name: normalized accuracy
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: HellaSwag (10-Shot)
          type: hellaswag
          split: validation
          args:
            num_few_shot: 10
        metrics:
          - type: acc_norm
            value: 81.08
            name: normalized accuracy
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: MMLU (5-Shot)
          type: cais/mmlu
          config: all
          split: test
          args:
            num_few_shot: 5
        metrics:
          - type: acc
            value: 67.47
            name: accuracy
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: TruthfulQA (0-shot)
          type: truthful_qa
          config: multiple_choice
          split: validation
          args:
            num_few_shot: 0
        metrics:
          - type: mc2
            value: 51.87
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: Winogrande (5-shot)
          type: winogrande
          config: winogrande_xl
          split: validation
          args:
            num_few_shot: 5
        metrics:
          - type: acc
            value: 77.82
            name: accuracy
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
      - task:
          type: text-generation
          name: Text Generation
        dataset:
          name: GSM8k (5-shot)
          type: gsm8k
          config: main
          split: test
          args:
            num_few_shot: 5
        metrics:
          - type: acc
            value: 61.87
            name: accuracy
        source:
          url: >-
            https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
          name: Open LLM Leaderboard
---


# Aplite-Instruct-4x8B-Llama-3

Aplite-Instruct-4x8B-Llama-3 is an experimental MoE (Mixture of Experts) model based on the Llama-3 architecture, built with Mergekit.

GGUF quant: Here, or the version made by mradermacher.

EXL2 quant: Here (thanks to @Knightcodin for the help).

The `<|eot_id|>` stop-token bug has been fixed.
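Below is a minimal inference sketch with 🤗 Transformers. It assumes the repository id `raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE` (as used in the leaderboard links above) and a GPU with enough memory for a 4x8B MoE in bfloat16; the sampling parameters are illustrative, not tuned.

```python
# Minimal generation sketch (assumption: repo id and hardware as noted above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Write a story for four llamas."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Llama-3 chat models end their turns with <|eot_id|>, so pass it as an
# additional stop token alongside the regular EOS token.
terminators = [
    tokenizer.eos_token_id,
    tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

outputs = model.generate(
    input_ids,
    max_new_tokens=512,
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```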

## Disclaimer

This model is a research experiment and may generate incorrect or harmful content. The model's outputs should not be taken as factual or representative of the views of the model's creator or any other individual.

The model's creator is not responsible for any harm or damage caused by the model's outputs.

## Chat Example

User: Write a story for four llamas.

Aplite-Instruct-4x8B:

Once upon a time, in a land far, far away, there lived four little llamas. They were the best of friends and lived together in a cozy little village nestled in the heart of the Andes mountains.

Their names were Lolo, Lola, Lina, and Lucas. Lolo was the smallest of the group, but he had the biggest heart. Lola was the oldest and wisest of the four, and she loved to tell stories and sing songs. Lina was the most adventurous, always eager to explore the surrounding hills and valleys. And Lucas was the most mischievous, always getting into trouble but always with a twinkle in his eye.

One day, the four little llamas decided to go on an adventure together. They packed up their favorite snacks and set off into the mountains. As they climbed higher and higher, the air grew cooler and the snow began to fall. They trudged on, not minding the cold, for they were having too much fun.

After a while, they came to a beautiful little lake, surrounded by towering peaks. The llamas decided to take a break and enjoy the view. They sat down on the snow, munching on their snacks and watching the snowflakes dance around them.

Just then, a gentle voice called out to them. "Hello, little llamas! What are you doing all the way up here?" The llamas looked up to see a kind old llama standing before them. She had a soft, gentle face and a warm smile.

"We're on an adventure," said Lolo. "We wanted to see the mountains and the snow."

"Ah, that's a good reason to come up here," said the old llama. "But you should be careful. The mountains can be treacherous."

"We're careful," said Lucas. "We're not afraid of the mountains."

The old llama smiled. "I see that. Well, I have a special gift for you. I've been saving it for a special occasion."

She led the four little llamas to a small cave, where she opened a wooden box. Inside were four small lanterns, each one glowing with a soft, golden light.

"These are magic lanterns," said the old llama. "When you light them, they will keep you safe from the dangers of the mountains. And they will always guide you back home."

The four little llamas were thrilled. They thanked the old llama and promised to be careful.



As they continued on their journey, the lanterns cast a warm, golden glow around them, lighting up the path ahead. They could see everything clearly, even in the midst of the falling snow.

After a while, they came to a narrow, winding path that led them to the top of a steep cliff. The path was slippery with ice, and the wind was blowing fiercely. But the lanterns shone brightly, and the four little llamas made their way safely to the top.

As they looked out over the breathtaking view, they saw a strange sight. A group of tiny creatures were scurrying about, gathering something from the ground. They looked like tiny, furry balls with long tails.

"What are they doing?" asked Lina.

"I think they're gathering berries," said Lola. "But they're so small!"

Just then, one of the creatures looked up and saw the four little llamas. It waved its tail and shouted, "Hello! We're the tiny penguins! We live here in the mountains, and we love to gather berries."

The four little llamas were amazed. "Tiny penguins?" they exclaimed. "We've never seen anything like you!"

The tiny penguins invited the llamas to join them in their berry-gathering. The llamas were happy to help, and soon they were all working together, laughing and chatting as they went.

After a while, they decided to take a break and enjoy their snack. The tiny penguins shared their berries with the llamas, and the llamas shared their snacks with the penguins.

As they sat together, munching on their treats, the tiny penguins asked the llamas to stay with them for a while. "We'd love to show you around our home," they said. "You can meet our friends and see our favorite spots."

The four little llamas were thrilled. They had never had such an exciting adventure before. And so they spent the rest of the day with the tiny penguins, exploring the mountains and making new friends.

As the sun began to set, the tiny penguins led the llamas back to the cliff path. They said goodbye to their new friends and wished them a safe journey home.

The four little llamas thanked the penguins and lit their lanterns once more. As they made their way back down the mountain, they could see the tiny penguins waving goodbye from the top of the cliff.

When they arrived back in their village, they told everyone about their amazing adventure. They showed them the lanterns and told them about the tiny penguins and their berry-gathering.

The villagers were amazed and delighted. They had never heard of such a wonderful adventure before. And from that day on, the four little llamas were known as the bravest and most adventurous llamas in the land.

And they lived happily ever after, always ready for their next adventure.

## Merge Details

```yaml
base_model: Meta-Llama-3-8B-Instruct
experts:
  - source_model: Meta-Llama-3-8B-Instruct
    positive_prompts:
    - "explain"
    - "chat"
    - "assistant"
  - source_model: Llama3-8B-OpenHermes-DPO
    positive_prompts:
    - "python"
    - "math"
    - "solve"
    - "code"
  - source_model: Llama-3-SLERP-8B
    positive_prompts:
    - "chat"
    - "assistant"
    - "AI"
  - source_model: hf-llama3-8b-orpo-v0.0
    positive_prompts:
    - "think"
    - "chat"
    - "code"
    - "roleplay"
gate_mode: hidden
dtype: float16
```
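This is a mergekit-moe configuration; `gate_mode: hidden` derives the router gates from hidden-state representations of the positive prompts. As a rough, hypothetical sketch (assuming mergekit is installed and the config above has been saved as `aplite-moe.yaml`, a filename chosen here for illustration), the merge could be reproduced by invoking the `mergekit-moe` CLI:

```python
# Hypothetical reproduction sketch: call the mergekit-moe CLI on the config
# above (assumes `pip install mergekit`, access to the four expert repos, and
# enough RAM/VRAM to hold them during the merge).
import subprocess

subprocess.run(
    ["mergekit-moe", "aplite-moe.yaml", "./Llama-3-Aplite-Instruct-4x8B-MoE"],
    check=True,
)
```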

## Join our Discord

If you'd like to discuss potential collaborations or applications, feel free to reach out to me on Discord: https://discord.gg/KugcbJX5

Meta Llama 3 is licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric                            | Value |
|-----------------------------------|-------|
| Avg.                              | 66.99 |
| AI2 Reasoning Challenge (25-Shot) | 61.86 |
| HellaSwag (10-Shot)               | 81.08 |
| MMLU (5-Shot)                     | 67.47 |
| TruthfulQA (0-shot)               | 51.87 |
| Winogrande (5-shot)               | 77.82 |
| GSM8k (5-shot)                    | 61.87 |