The fine-tuned instruct version is here.
Check the fine-tuned version of mistral-community/Mixtral-8x22B-v0.1 here: HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1.
Dataset used: argilla/distilabel-capybara-dpo-7k-binarized.
Zephyr is a series of language models designed to act as helpful assistants. Zephyr 141B-A35B, the latest model in the series, is a fine-tuned version of mistral-community/Mixtral-8x22B-v0.1, trained with the Odds Ratio Preference Optimization (ORPO) algorithm on 7k instances for 1.3 hours across 4 nodes of 8x H100 GPUs.
ORPO achieves high performance without requiring an SFT step, making it computationally efficient. Zephyr-141B-A35B was trained using the argilla/distilabel-capybara-dpo-7k-binarized preference dataset, containing synthetic, high-quality, multi-turn preferences scored via LLMs.
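For intuition on why no SFT step is needed: the ORPO objective adds an odds-ratio penalty directly on top of the ordinary supervised NLL loss on the chosen response, so preference alignment and supervised fitting happen in one pass. A minimal numeric sketch of the penalty term from the ORPO formulation (the probabilities and the weighting `lam` below are illustrative, not values used for Zephyr):

```python
import math

def odds(p: float) -> float:
    """odds(P) = P / (1 - P) for a sequence probability P."""
    return p / (1.0 - p)

def orpo_penalty(p_chosen: float, p_rejected: float) -> float:
    """Odds-ratio term: -log sigmoid(log(odds(p_chosen) / odds(p_rejected)))."""
    log_odds_ratio = math.log(odds(p_chosen)) - math.log(odds(p_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-log_odds_ratio)))

# Full per-example ORPO loss is roughly: NLL(chosen) + lam * orpo_penalty(...)
lam = 0.1  # illustrative weighting hyperparameter

print(orpo_penalty(0.5, 0.5))  # equal odds -> sigmoid(0) = 0.5 -> penalty = log 2
print(orpo_penalty(0.9, 0.1))  # chosen strongly preferred -> penalty near 0
```

The penalty shrinks toward zero as the model assigns higher odds to the chosen response than the rejected one, which is what pushes the policy toward the preferred completions during fine-tuning.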
Official: Mixtral-8x22B-Instruct-v0.1
MistralAI has also uploaded the weights to their own organization, at mistralai/Mixtral-8x22B-v0.1 and mistralai/Mixtral-8x22B-Instruct-v0.1.
- 8x22B MoE with ~141B total parameters (less than the naive 176B because the experts share layers) and roughly 39B active parameters per token
- Supports a context length of up to 65k tokens
- Fine-tuning available for the base model
- Requires approximately 260GB VRAM in fp16 or 73GB in int4
- Licensed under Apache 2.0, as stated on their Discord
- Utilizes a tokenizer similar to previous models
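The VRAM figures above follow from simple arithmetic: weight memory is roughly parameter count times bytes per parameter. A quick back-of-envelope check, using a round 141B total-parameter figure (the thread's 260GB and 73GB numbers will differ slightly depending on GB vs GiB and on quantization overhead such as int4 scales and zero points):

```python
def weight_memory_gib(n_params: float, bits_per_param: float) -> float:
    """Approximate memory needed for model weights alone, in GiB (2**30 bytes)."""
    return n_params * bits_per_param / 8 / 2**30

N = 141e9  # approximate total parameter count for Mixtral-8x22B

fp16 = weight_memory_gib(N, 16)  # 2 bytes per parameter
int4 = weight_memory_gib(N, 4)   # 0.5 bytes per parameter, before overhead
print(f"fp16: {fp16:.0f} GiB, int4: {int4:.0f} GiB")
```

This lands close to the quoted 260GB for fp16; real int4 deployments come out a bit above the raw 4-bit figure once quantization metadata and runtime buffers (KV cache, activations) are counted.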
lol, 260GB VRAM. That's way out of reach of hobbyists. It may be open, but for most people it's unachievable anyway.
11x 3090: which motherboard supports that?
An Nvidia GeForce RTX 3090 works in any motherboard with a PCI-Express (PCIe) x16 slot.
A few years ago I used an Asus B250 motherboard with 12x 1080 Ti GPUs plus 3x RX 580s, 15 cards in total, for mining. The Asus B250 has 15 PCIe slots, and you can add more with an M.2-to-PCIe converter.