DavidAU committed
Commit 786e35a · verified · 1 Parent(s): 6006e49

Update README.md

Files changed (1): README.md (+8, -1)
README.md CHANGED
@@ -96,7 +96,14 @@ Example outputs below.
 
 <B>Meet the Team: Mixture of Experts Models</B>
 
-This model is comprised of the following 4 models ("the experts") (in full):
+This model is based on the original "Llama 3 Dark Planet 8B" (<a href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B-GGUF">GGUF</a> / <a href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B">SOURCE</a>) merge, which has been "evolved" several times. Each "evolved"
+version is then tested; if it is unique, removes certain negative attributes, and/or enhances certain positive attributes, it is kept. Otherwise it is deleted.
+
+This model contains the four best models from that process ("b3", "b4", "r1", and "b6"), with the very best serving as the "captain" of the MOE, so to speak.
+
+None of these versions have ever been released, but they contain the "raw source DNA" of the original model.
+
+This process was first explored in the <a href="https://huggingface.co/collections/DavidAU/d-au-wordstorm-10-part-series-incl-full-source-67257ba027f7e244222907fd">WORDSTORM Project</a>.
 
 The mixture of experts is set at 2 experts, but you can use 3 or 4 too.
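
The added text notes the MOE runs with 2 experts active by default but supports 3 or 4. As a minimal sketch (not part of this commit), here is one way to raise the active-expert count at load time with llama-cpp-python; the `llama.expert_used_count` GGUF metadata key and the local filename are assumptions, so check the model card for the author's recommended loader settings.

```python
# Sketch: load the GGUF and override the number of experts used per token.
# The filename below is hypothetical; point model_path at your local download.
from llama_cpp import Llama

llm = Llama(
    model_path="Dark-Planet-MOE-4x8B.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,
    # Default is 2 active experts; per the README, 3 or 4 also work.
    kv_overrides={"llama.expert_used_count": 3},
)

out = llm("Write the opening paragraph of a dark science-fiction story.", max_tokens=128)
print(out["choices"][0]["text"])
```

Using more experts trades speed for quality: each active expert adds compute per token, which is why a 2-expert default keeps the 4x8B mixture responsive.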