# flan-t5-small-moe

flan-t5-small-moe is a Mixture of Experts (MoE) model built from the following models using LazyMergekit:
- google/flan-t5-small
- google/flan-t5-small
- google/flan-t5-small
- google/flan-t5-small
- google/flan-t5-small
- google/flan-t5-small
- google/flan-t5-small
## ✨ Configuration
```yaml
base_model: google/flan-t5-small
merge_method: moe
gate_mode: hidden
experts:
  - source_model: google/flan-t5-small
    positive_prompts: ["instruction", "task"]
  - source_model: google/flan-t5-small
    positive_prompts: ["reasoning", "logic"]
  - source_model: google/flan-t5-small
    positive_prompts: ["creative", "writing"]
  - source_model: google/flan-t5-small
    positive_prompts: ["code", "programming"]
  - source_model: google/flan-t5-small
    positive_prompts: ["science", "facts"]
  - source_model: google/flan-t5-small
    positive_prompts: ["math", "calculation"]
  - source_model: google/flan-t5-small
    positive_prompts: ["summary", "dialogue"]
```
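With `gate_mode: hidden`, each expert's router weights are seeded from hidden-state representations of its `positive_prompts`, so inputs resembling those prompts are routed toward that expert. A minimal toy sketch of the idea, using random stand-in vectors instead of real model hidden states (this is an illustration of the routing principle, not mergekit's actual implementation):

```python
import numpy as np

dim = 8  # toy hidden-state dimension

# Two of the expert/prompt pairs from the config above.
expert_prompts = {
    "instruction": ["instruction", "task"],
    "code": ["code", "programming"],
}

def embed(text: str) -> np.ndarray:
    # Stand-in for a model's hidden state of the text
    # (deterministic within one run via a hash-seeded RNG).
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).normal(size=dim)

# Each expert's gate vector = mean embedding of its positive prompts.
gates = {name: np.mean([embed(p) for p in prompts], axis=0)
         for name, prompts in expert_prompts.items()}

def route(text: str) -> dict:
    # Score the input against every gate vector, then softmax
    # the scores into mixing weights over the experts.
    x = embed(text)
    scores = {name: float(g @ x) for name, g in gates.items()}
    m = max(scores.values())
    exps = {k: np.exp(v - m) for k, v in scores.items()}
    z = sum(exps.values())
    return {k: v / z for k, v in exps.items()}

weights = route("write a python function")
print(weights)  # per-expert mixing weights summing to 1
```

In the real merge, the router of the combined model plays the role of `route` here; the `positive_prompts` only influence how its gate weights are initialized.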