
WS - Dark Planet Wordstorm Project - Random Prune / Form.
Models built from the original Dark Planet 8B formula, with random pruning / density applied to create new Dark Planet versions with new abilities and new generation characteristics.
Text Generation • 8B • Updated • 2.34k • 47 • Note: The original Dark Planet in GGUF, with links to the new 128k / 1 million context versions, as well as expanded Dark Planet versions such as DARKEST PLANET 16.5B.
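For reference, a GGUF quant from this collection can be run locally with llama-cpp-python. The sketch below is a minimal example only; the filename, quant level, context size, and sampler settings are placeholders rather than values taken from the model card.

```python
# Minimal sketch: loading a Dark Planet GGUF quant with llama-cpp-python.
# The filename and sampler settings are placeholders; use the quant you
# actually downloaded and the parameters recommended on the model card.
from llama_cpp import Llama

llm = Llama(
    model_path="./L3-Dark-Planet-8B.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=8192,        # raise this for the 128k / 1 million context versions
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

prompt = "Write the opening paragraph of a dark science-fiction story."
out = llm(prompt, max_tokens=300, temperature=1.1, repeat_penalty=1.05)
print(out["choices"][0]["text"])
```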
DavidAU/L3-Dark-Planet-8B
Text Generation • 8B • Updated • 12 • 7 • Note: The original Dark Planet in full precision / source weights. Mergekit file included.
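The "random prune / density" idea maps naturally onto mergekit-style merges in which each donor model is pruned to a randomized density before combining. The sketch below is an illustration under assumptions: the donor model names, density range, weights, and merge method are hypothetical, not the actual recipe in the repository's included mergekit file.

```python
# Illustrative sketch only: a DARE-TIES style mergekit config with randomized
# per-model densities, approximating the "random prune / density" idea.
# Model names, density range, and weights are hypothetical placeholders.
import random
import subprocess

donors = [
    "hypothetical/dark-planet-donor-a",
    "hypothetical/dark-planet-donor-b",
    "hypothetical/dark-planet-donor-c",
]

lines = [
    "merge_method: dare_ties",
    "base_model: hypothetical/llama-3-8b-base",
    "dtype: bfloat16",
    "models:",
]
for m in donors:
    density = round(random.uniform(0.3, 0.7), 2)  # randomized pruning density
    lines += [
        f"  - model: {m}",
        "    parameters:",
        f"      density: {density}",
        f"      weight: {round(1.0 / len(donors), 2)}",
    ]

with open("dark-planet-random.yml", "w") as f:
    f.write("\n".join(lines) + "\n")

# mergekit's CLI entry point for YAML merge configs (pip install mergekit)
subprocess.run(["mergekit-yaml", "dark-planet-random.yml", "./merged-model"], check=True)
```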
DavidAU/L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B-GGUF
Text Generation • 47B • Updated • 566 • 15 • Note: This model combines EIGHT of the "Dark Planet Wordstorm" models ("cr2", "cr1", "r7", "r6", "b3", "b4", "r1" and "b6") in a MoE (Mixture of Experts) configuration, letting you draw on the power of up to 8 of these models at the same time.
DavidAU/L3.1-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-Uncensored-47B-GGUF
Text Generation • 47B • Updated • 472 • 3 • Note: This model combines EIGHT of the "Dark Planet Wordstorm" models ("cr2", "cr1", "r7", "r6", "b3", "b4", "r1" and "b6") in a MoE (Mixture of Experts) configuration, letting you draw on the power of up to 8 of these models at the same time. It uses a Llama 3.1 model as the "base" of the MoE to extend context to 128k.
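mergekit also offers a MoE assembly mode (mergekit-moe) that gates several fine-tunes behind a single router; the sketch below shows the general shape of such a config with a Llama 3.1 base for long context. The expert repo names, positive prompts, and gate settings are assumptions for illustration only, not the actual configuration behind these 47B models.

```python
# Illustrative sketch only: a mergekit-moe config gating a few "Wordstorm"
# variants behind a Llama 3.1 base. Expert repo names, positive prompts,
# and gate_mode are hypothetical placeholders.
import subprocess

CONFIG = """\
base_model: hypothetical/llama-3.1-8b-instruct
gate_mode: hidden        # route tokens using hidden-state similarity
dtype: bfloat16
experts:
  - source_model: hypothetical/dark-planet-wordstorm-cr2
    positive_prompts: ["grim, vivid horror prose"]
  - source_model: hypothetical/dark-planet-wordstorm-r7
    positive_prompts: ["fast-paced action scenes"]
  - source_model: hypothetical/dark-planet-wordstorm-b6
    positive_prompts: ["dialogue-heavy character work"]
"""

with open("dark-planet-moe.yml", "w") as f:
    f.write(CONFIG)

# mergekit-moe assembles the sparse MoE checkpoint from the listed experts.
subprocess.run(["mergekit-moe", "dark-planet-moe.yml", "./dark-planet-moe"], check=True)
```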
DavidAU/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B-GGUF
Text Generation • 25B • Updated • 209 • 3 • Note: This model combines FOUR of the "Dark Planet Wordstorm" models ("b3", "b4", "r1" and "b6") in a MoE (Mixture of Experts) configuration, letting you draw on the power of up to 4 of these models at the same time.