mobiuslabsgmbh/Mixtral-8x7B-v0.1-hf-attn-4bit-moe-2bit-HQQ
Mobius Labs GmbH

Text Generation · Transformers · mixtral · Mixture of Experts
License: apache-2.0
Commit History
Librarian Bot: Add moe tag to model
2534903 · librarian-bot committed on Jan 8

Update README.md
b4d5653 · mobicham committed on Dec 18, 2023

Update README.md
88e72ff · mobicham committed on Dec 18, 2023

Update README.md
c30159c · mobicham committed on Dec 18, 2023

Update README.md
bd7e5f5 · mobicham committed on Dec 15, 2023

upload model
6f6f04b · mobicham committed on Dec 15, 2023

initial commit
94cefda · mobicham committed on Dec 15, 2023