Steelskull committed on
Commit 884d363
1 Parent(s): b9c104a

Update README.md

Files changed (1):
  1. README.md +66 -1
README.md CHANGED
 
@@ -23,6 +23,9 @@ Why? Dunno, wanted to see what would happen
 
 context is maybe 16k? waiting for GGUF to upload.
 
+Chat-instruct breaks the model at the moment, not really sure why, even though it will follow instructions.
+
+
 
 Template:
 ```
@@ -33,6 +36,13 @@ Template:
 ### Assistant:
 ```
 
+
+Settings:
+```
+Temp: 1.0
+min-p: 0.02-0.1
+```
+
 Lumosia-MoE-4x10.7 is a Mixture of Experts (MoE) made with the following models:
 * [DopeorNope/SOLARC-M-10.7B](https://huggingface.co/DopeorNope/SOLARC-M-10.7B)
 * [maywell/PiVoT-10.7B-Mistral-v0.2-RP](https://huggingface.co/maywell/PiVoT-10.7B-Mistral-v0.2-RP)
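The Settings block above maps onto standard sampler parameters. As a minimal sketch, here is one way to apply them with transformers (the repo id `Steelskull/Lumosia-MoE-4x10.7` and the prompt are assumptions for illustration, and the `min_p` argument needs a reasonably recent transformers release):

```python
# Sketch only: Temp 1.0 with min-p 0.05, the middle of the suggested 0.02-0.1 range.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Steelskull/Lumosia-MoE-4x10.7"  # assumed repo id, not confirmed by this card
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Only the "### Assistant:" tail of the template is visible in this diff,
# so the prompt here just reuses that tail.
prompt = "Hello lumi, what is the meaning of life?\n### Assistant:\n"
inputs = tok(prompt, return_tensors="pt").to(model.device)

out = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.0,  # Temp: 1.0
    min_p=0.05,       # min-p: 0.02-0.1
    max_new_tokens=200,
)
print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

For the GGUF build mentioned above, the equivalent llama.cpp knobs are `--temp 1.0 --min-p 0.05`.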
 
@@ -41,9 +51,64 @@ Lumosia-MoE-4x10.7 is a Mixture of Experts (MoE) made with the following models:
 
 ## Evals:
 
-
 * Pending
 
+## Examples:
+```
+Example 1:
+
+User:
+Hello lumi, what is the meaning of life in around 100 words.
+
+Lumosia:
+The meaning of life can vary greatly depending on individual beliefs, values,
+and experiences. One common perspective suggests that the primary purpose of
+life is to seek happiness, fulfillment, and personal growth through
+relationships, accomplishments, and spiritual awareness. Living with
+integrity, compassion, and gratitude while making a positive impact on others
+and the world around us can also contribute to finding meaning in life.
+Ultimately, the meaning of life is subjective and unique to each individual,
+who must define it for themselves through exploration, reflection, and
+personal choice.
+```
+```
+Example 2:
+
+User:
+Explain a MoE (Mixture of Experts) in around 100 words
+
+Lumosia:
+A MoE, short for Mixture of Experts, is a machine learning technique utilized
+in ensemble learning. It involves combining multiple specialist models
+(experts) to optimize predictive performance. Each expert focuses on a subset
+of the input features or data domains, making them specialized in particular
+areas. Through a weighting mechanism, like a Gaussian Mixture Model, the
+predictions of experts are blended to produce a final result. MoE models
+demonstrate effectiveness in handling complex or ambiguous inputs where a
+single model might struggle. They are commonly used in natural language
+processing, computer vision, and speech synthesis.
+```
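Example 2's answer sketches the classic gating picture: a router scores each expert on the input, and the experts' outputs are blended by those scores. A generic toy version of that idea in PyTorch (illustration only, not how Lumosia's routing is actually implemented):

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    """Dense toy mixture: every expert runs; the gate blends their outputs."""

    def __init__(self, dim: int, n_experts: int = 4):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts)  # router: one score per expert
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)             # (batch, n_experts)
        outs = torch.stack([e(x) for e in self.experts], dim=-1)  # (batch, dim, n_experts)
        return (outs * weights.unsqueeze(1)).sum(dim=-1)          # weighted blend

x = torch.randn(2, 16)
print(TinyMoE(16)(x).shape)  # torch.Size([2, 16])
```

Sparse LLM MoEs typically keep only the top-k gate scores per token instead of blending every expert, which keeps inference cost close to that of a single expert.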
 
 ## 🧩 Configuration