---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---

# Model Card for Lumosia-MoE-4x10.7

This model is an extreme experiment: a Mixture of Experts (MoE) built from multiple high-performing Solar models. Let me know what you think.

## Model Details

### Model Description

This MoE model is an extreme experiment: a merge of multiple high-performing Solar models into a single Mixture of Experts. I am thinking about finetuning it on an RP dataset later on to direct the model further.

- **Model type:** [More Information Needed]

### Model Sources [optional]

The model was assembled with the following mergekit configuration:

```yaml
model_name: Lumosia-MoE-4x10.7
base_model: DopeorNope/SOLARC-M-10.7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: DopeorNope/SOLARC-M-10.7B
    positive_prompts: [""]
  - source_model: maywell/PiVoT-10.7B-Mistral-v0.2-RP
    positive_prompts: [""]
  - source_model: kyujinpy/Sakura-SOLAR-Instruct
    positive_prompts: [""]
  - source_model: jeonsworld/CarbonVillain-en-10.7B-v1
    positive_prompts: [""]
```
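To illustrate what the gating in a merged MoE like this does at inference time, here is a minimal, self-contained sketch of Mixtral-style top-2 routing over the four experts: the gate scores each expert, the top two are kept, their weights are renormalized with a softmax, and their outputs are mixed. This is a toy illustration with made-up numbers, not this model's actual implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_combine(gate_logits, expert_outputs, top_k=2):
    """Route one token: keep the top_k experts by gate logit,
    renormalize their weights, and mix their outputs."""
    top = sorted(range(len(gate_logits)),
                 key=lambda i: gate_logits[i], reverse=True)[:top_k]
    weights = softmax([gate_logits[i] for i in top])
    dim = len(expert_outputs[0])
    out = [0.0] * dim
    for w, i in zip(weights, top):
        for d in range(dim):
            out[d] += w * expert_outputs[i][d]
    return out

# Four experts (one per merged Solar model), each producing a toy 2-dim output.
logits = [2.0, 0.5, 1.5, -1.0]           # hypothetical gate scores for one token
outputs = [[1.0, 0.0], [0.0, 1.0],
           [0.5, 0.5], [1.0, 1.0]]
mixed = moe_combine(logits, outputs)      # blend of experts 0 and 2 only
```

Only two of the four 10.7B experts are active per token, which is why a 4x10.7 MoE runs much cheaper than its total parameter count suggests.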