dandanw committed on
Commit 41d6980 · 1 Parent(s): 153899f

Update README.md

Files changed (1)
  1. README.md +32 -0
README.md CHANGED
@@ -1,3 +1,35 @@
  ---
  license: bigscience-bloom-rail-1.0
+ language:
+ - en
+ - sv
  ---
+
+ # Model Summary
+
+ This is a base causal model extended from [bigscience/bloomz-3b](https://huggingface.co/bigscience/bloomz-3b).
+
+ * Model size: 3.02B parameters (~20M more than the base model)
+ * The tokenizer is extended to support Swedish: 8,068 additional tokens, trained on Swedish Wikipedia and OSCAR, have been added, and the embedding layer is extended accordingly.
+ * The embedding layer and the self-attention query_key_value layers are re-trained on mixed English and Swedish corpora (see the sketch after this list).
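+
+ A minimal sketch, using standard `transformers` APIs, of how such a tokenizer and embedding extension can be set up; the token list file below is a hypothetical placeholder, and this is not the exact training code:
+
+ ```python
+ # Extend the BLOOM tokenizer with new Swedish tokens, resize the embedding
+ # matrix to match, and unfreeze only the layers this card says were re-trained.
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("bigscience/bloomz-3b")
+ model = AutoModelForCausalLM.from_pretrained("bigscience/bloomz-3b")
+
+ # Hypothetical file with one new Swedish token per line.
+ with open("swedish_tokens.txt", encoding="utf-8") as f:
+     new_tokens = [line.strip() for line in f if line.strip()]
+
+ tokenizer.add_tokens(new_tokens)
+ model.resize_token_embeddings(len(tokenizer))  # grows the (tied) embedding matrix
+
+ # Freeze everything, then unfreeze the embedding and query_key_value layers.
+ for param in model.parameters():
+     param.requires_grad = False
+ for name, param in model.named_parameters():
+     if "word_embeddings" in name or "query_key_value" in name:
+         param.requires_grad = True
+ ```
+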
+ # Intended Use
+ This model was created to enable **the use of Swedish and English with LLMs**, covering public research and business use cases. It is intended for language generation or as a pretrained base model.
+ **It needs to be further fine-tuned for specific tasks.**
+
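+ As a base causal model, it can be used for plain-text continuation out of the box, for example with `transformers`; the repository id below is a placeholder, since the actual id is not stated here:
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ model_id = "your-namespace/this-model"  # hypothetical placeholder
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id)
+
+ # Continue a Swedish prompt.
+ inputs = tokenizer("Stockholm är huvudstaden i", return_tensors="pt")
+ outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+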
+ The model inherits the bigscience-bloom-rail-1.0 license from the base model. It shall **NOT** be used for harmful purposes. For use restrictions, please see [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license), Appendix A.
+
+ # Training Corpora:
+
+ The model is re-trained on ~800M Swedish tokens and ~260M English tokens, drawn from the datasets below (an illustrative mixing sketch follows the list).
+
+ * [olm/wikipedia](https://huggingface.co/datasets/olm/wikipedia)
+ * [oscar](https://huggingface.co/datasets/oscar)
+ * [sbx/superlim-2](https://huggingface.co/datasets/sbx/superlim-2)
+ * [Gabriel/xsum_swe](https://huggingface.co/datasets/Gabriel/xsum_swe)
+
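+ A minimal sketch, assuming the `datasets` streaming API, of one way to interleave Swedish and English text in roughly the reported proportions; the configs, snapshot date, and mixing weights are assumptions, not the actual training recipe:
+
+ ```python
+ from datasets import interleave_datasets, load_dataset
+
+ # Assumed configs: Swedish OSCAR and an English OLM Wikipedia snapshot.
+ sv = load_dataset("oscar", "unshuffled_deduplicated_sv", split="train", streaming=True)
+ en = load_dataset("olm/wikipedia", language="en", date="20221101", split="train", streaming=True)
+
+ # Keep only the text column so the two streams have matching features.
+ sv = sv.select_columns(["text"])
+ en = en.select_columns(["text"])
+
+ # ~800M Swedish vs ~260M English tokens is roughly a 75/25 mix.
+ mixed = interleave_datasets([sv, en], probabilities=[0.75, 0.25], seed=42)
+ for example in mixed.take(3):
+     print(example["text"][:80])
+ ```
+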
+ # Notes:
+ * The model is re-trained only on Swedish and English, and it appears that only its Swedish and English capabilities are retained. To re-enable another language, you would need to re-train the model on data in that language.
+ * Although the base model, bloomz-3b, is an instruction fine-tuned version of BLOOM, the model appears to have lost its instruction-following capability after this re-training. It is now simply a base causal model that can speak Swedish.