---
license: apache-2.0
tags:
  - moe
  - merge
  - mergekit
  - lazymergekit
  - phi3_mergekit
  - microsoft/Phi-3-mini-4k-instruct
base_model:
  - microsoft/Phi-3-mini-4k-instruct
  - microsoft/Phi-3-mini-4k-instruct
---

Phi3Mix

Phi3Mix is a Mixture of Experts (MoE) built from the following models using Phi3_LazyMergekit:

- microsoft/Phi-3-mini-4k-instruct
- microsoft/Phi-3-mini-4k-instruct

🧩 Configuration

base_model: microsoft/Phi-3-mini-4k-instruct
gate_mode: cheap_embed
experts_per_token: 1
dtype: float16
experts:
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["research, logic, math, science"]
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["creative, art"]
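To reproduce the merge, a configuration like the one above is passed to mergekit's MoE entry point. The sketch below is a minimal example, not the exact command used for this model; it assumes mergekit is installed (pip install mergekit) and that the YAML has been saved locally as config.yaml (an illustrative file name):

import subprocess

# Run the MoE merge described by the configuration above.
# "merge" is the output directory for the merged model.
subprocess.run(["mergekit-moe", "config.yaml", "merge"], check=True)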

💻 Usage

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mccoole/Phi3Mix"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Load in float16 to match the merge dtype from the configuration above
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    trust_remote_code=True,
)

prompt = "How many continents are there?"
# Phi-3 chat format: each turn is terminated with <|end|>
input_text = f"<|system|>You are a helpful AI assistant.<|end|><|user|>{prompt}<|end|><|assistant|>"
tokenized_input = tokenizer.encode(input_text, return_tensors="pt")

outputs = model.generate(tokenized_input, max_new_tokens=128, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0]))
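Rather than hand-writing the special tokens, the tokenizer's chat template can assemble the same prompt format. A minimal sketch, assuming the merged model ships Phi-3's chat template:

messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "How many continents are there?"},
]

# Apply the chat template and append the assistant header so the model continues as the assistant
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))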