inclusionAI/Ming-flash-omni-Preview
Tags: Any-to-Any · Diffusers · Safetensors · English · bailingmm_moe_v2_lite
arXiv: 2510.24821, 2506.09344
License: MIT
vllm / sglang support? #2
Opened by CHNtentes · Oct 28

CHNtentes · Oct 28
It's painfully slow to run MoE models with transformers...
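If vLLM were to add support for this model's architecture (bailingmm_moe_v2_lite), serving it would presumably follow vLLM's standard CLI; the sketch below is hypothetical usage under that assumption, not a confirmation that support exists. `vllm serve`, `--tensor-parallel-size`, and `--trust-remote-code` are standard vLLM options.

```shell
# Hypothetical invocation, assuming vLLM gains support for the
# bailingmm_moe_v2_lite architecture (not the case as of this thread).
# Starts an OpenAI-compatible server, sharding the MoE weights
# across 4 GPUs with tensor parallelism.
vllm serve inclusionAI/Ming-flash-omni-Preview \
    --tensor-parallel-size 4 \
    --trust-remote-code
```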