
Medorca-2x7b

Medorca-2x7b is a Mixture of Experts (MoE) made with the following models:

- epfl-llm/meditron-7b
- microsoft/Orca-2-7b

Evaluations

| Benchmark | Medorca-2x7b | Orca-2-7b | llama-2-7b | meditron-7b | meditron-70b |
|---|---|---|---|---|---|
| MedMCQA | | | | | |
| ClosedPubMedQA | | | | | |
| PubMedQA | | | | | |
| MedQA | | | | | |
| MedQA4 | | | | | |
| MedicationQA | | | | | |
| MMLU Medical | | | | | |
| MMLU | 53.3 | 56.37 | | | |
| TruthfulQA | 48.04 | 52.45 | | | |
| GSM8K | 20.64 | 14.71 | | | |
| ARC | 54.1 | 54.1 | | | |
| HellaSwag | 76.04 | 76.19 | | | |
| Winogrande | 74.51 | 73.48 | | | |

More details on the Open LLM Leaderboard evaluation results can be found here

🧩 Configuration

base_model: microsoft/Orca-2-7b
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: epfl-llm/meditron-7b
    positive_prompts: 
      - "How does sleep affect cardiovascular health?"
      - "Could a plant-based diet improve arthritis symptoms?"
      - "A patient comes in with symptoms of dizziness and nausea..."
      - "When discussing diabetes management, the key factors to consider are..."
      - "The differential diagnosis for a headache with visual aura could include..."
    negative_prompts:
      - "Recommend a good recipe for a vegetarian lasagna."
      - "Give an overview of the French Revolution."
      - "Explain how a digital camera captures an image."
      - "What are the environmental impacts of deforestation?"
      - "The recent advancements in artificial intelligence have led to developments in..."
      - "The fundamental concepts in economics include ideas like supply and demand, which explain..."
  - source_model: microsoft/Orca-2-7b
    positive_prompts:
      - "Here is a funny joke for you -"
      - "When considering the ethical implications of artificial intelligence, one must take into account..."
      - "In strategic planning, a company must analyze its strengths and weaknesses, which involves..."
      - "Understanding consumer behavior in marketing requires considering factors like..."
      - "The debate on climate change solutions hinges on arguments that..."
    negative_prompts:
      - "In discussing dietary adjustments for managing hypertension, it's crucial to emphasize..."
      - "For early detection of melanoma, dermatologists recommend that patients regularly check their skin for..."
      - "Explaining the importance of vaccination, a healthcare professional should highlight..."

πŸ’» Usage

!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "Technoculture/Medorca-2x7b"

# Load the tokenizer and build a text-generation pipeline in half precision
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16},
)

# Format the chat message with the model's chat template, then generate
messages = [{"role": "user", "content": "Why am I feeling so tired this month?"}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
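bitsandbytes is installed above but not used in the snippet itself. As an optional sketch (not from the original card), the model can also be loaded in 4-bit to reduce memory use, assuming a CUDA GPU and a transformers version that supports BitsAndBytesConfig:

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
import torch

model_id = "Technoculture/Medorca-2x7b"

# 4-bit NF4 quantization with bfloat16 compute
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map="auto")

# Apply the chat template and generate, mirroring the pipeline example above
messages = [{"role": "user", "content": "What lifestyle changes help lower blood pressure?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))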