---
library_name: transformers
license: apache-2.0
base_model: ibm-granite/granite-3.0-3b-a800m-base
tags:
- axolotl
- moe
- roleplay
model-index:
- name: MoE_Girl_800MA_3BT
  results: []
---

[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)

# QuantFactory/MoE-Girl-800MA-3BT-GGUF
This is a quantized version of [allura-org/MoE-Girl-800MA-3BT](https://huggingface.co/allura-org/MoE-Girl-800MA-3BT) created using llama.cpp.

# Original Model Card

# MoE Girl 800mA 3bT
![made with hassakuXL in sd-webui-forge](moe-girl-800-3.png)
A roleplay-centric finetune of IBM's Granite 3.0 3B-A800M. Unlike the other models in the series, which were full finetunes (FFT), this one is a LoRA finetune trained locally; while that means less uptake of training data, it should also mean less degradation of Granite's core abilities, making the model potentially easier to use for general-purpose tasks.

## Disclaimer
PLEASE do not expect godliness out of this; it's a model with _800 million_ active parameters. Expect something more akin to GPT-3 (the original, not GPT-3.5).
(Furthermore, this version is by a less experienced tuner; it's my first finetune that actually has decent-looking graphs, and I don't really know what I'm doing yet!)

## Quants
Soon:tm:

## Prompting
Use ChatML.
```
<|im_start|>system
You are a helpful assistant who talks like a pirate.<|im_end|>
<|im_start|>user
Hello there!<|im_end|>
<|im_start|>assistant
Yarr harr harr, me matey!<|im_end|>
```
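If you are building the prompt string yourself, the layout above can be sketched as a small Python helper. This is a minimal sketch, not an official API: `to_chatml` is a hypothetical function name, and many runtimes (including llama.cpp, when the GGUF embeds a chat template) can render this format for you instead.

```python
# Minimal sketch: render a conversation into the ChatML layout shown above.
# `to_chatml` is a hypothetical helper, not part of any library.
def to_chatml(messages, add_generation_prompt=True):
    """Render a list of {'role': ..., 'content': ...} dicts as a ChatML string."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant who talks like a pirate."},
    {"role": "user", "content": "Hello there!"},
]
print(to_chatml(messages))
```

The trailing unclosed `<|im_start|>assistant` turn is what cues the model to generate its reply; your inference stack should treat `<|im_end|>` as a stop token.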

## Thanks
Special thanks to the members of Allura for testing and emotional support, as well as the creators of all the datasets that were used in the Special Sauce used to train this model. I love you all <3 - Fizz

Thanks to Fizz for her work on the MoE Girl series, Auri for her counsel, and all of Allura for being great friends and supporting my learning process. - inflatebot