# Phi3Mix
Phi3Mix is a Mixture of Experts (MoE) model built with Phi3_LazyMergekit from two [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) experts, using the configuration below.
## 🧩 Configuration
```yaml
base_model: microsoft/Phi-3-mini-4k-instruct
gate_mode: cheap_embed
experts_per_token: 1
dtype: float16
experts:
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["research, logic, math, science"]
  - source_model: microsoft/Phi-3-mini-4k-instruct
    positive_prompts: ["creative, art"]
```
## 💻 Usage
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HassanStar/Phi3Mix"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the float16 dtype from the merge config
    trust_remote_code=True,
)

prompt = "How many continents are there?"
# Phi-3 chat format: each turn is closed with <|end|>
input_text = f"<|system|>You are a helpful AI assistant.<|end|><|user|>{prompt}<|end|><|assistant|>"
tokenized_input = tokenizer.encode(input_text, return_tensors="pt")

outputs = model.generate(tokenized_input, max_new_tokens=128, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(tokenizer.decode(outputs[0]))
```
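Instead of assembling the prompt string by hand, you can let the tokenizer build it. The sketch below uses the standard `transformers` `apply_chat_template` API and assumes the repository's tokenizer ships a Phi-3 style chat template that accepts a system message:

```python
messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "How many continents are there?"},
]
# apply_chat_template inserts the <|system|>/<|user|>/<|assistant|>/<|end|> tokens for us
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0]))
```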