minho park (minhopark-neubla)
AI & ML interests: None yet
Collections (5)
- Mixtral of Experts
  Paper • 2401.04088 • Published • 157
- MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts
  Paper • 2401.04081 • Published • 70
- TinyLlama: An Open-Source Small Language Model
  Paper • 2401.02385 • Published • 89
- LLaMA Pro: Progressive LLaMA with Block Expansion
  Paper • 2401.02415 • Published • 53
Models: None public yet
Datasets: None public yet