Minor edits to README.md
README.md CHANGED
@@ -17,8 +17,11 @@ language:
 
 # SEA-LION-Pile
 
-SEA-LION is a collection of Large Language Models (LLMs) which has been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
-This repository contains the cleaned mC4 portion of the
+SEA-LION-Pile is the pretraining dataset for SEA-LION, a collection of Large Language Models (LLMs) pretrained and instruct-tuned for the Southeast Asia (SEA) region.
+This repository contains the cleaned mC4 portion of the SEA-LION-Pile.
+
+The remainder of the SEA-LION-Pile dataset may be downloaded from the links provided below.
+
 
 ## Dataset Details
 
@@ -47,7 +50,9 @@ SEA-LION was trained on 980B tokens of the following data:
 | RedPajama - ArXiv | 30.6B | 1 | 30.6B | 3.12% |
 
 
-### 
+### Additional SEA-LION-Pile (non-mC4) Data Sources
+
+This section contains the links to the additional datasets that form the SEA-LION-Pile.
 
 - [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb)
 - [the Stack (Python, Javascript, Shell, SQL, Markdown)](https://huggingface.co/datasets/bigcode/the-stack-dedup)
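The data-mix table touched by the second hunk lists raw tokens, epochs, effective tokens, and each source's share of the 980B-token total (e.g. RedPajama - ArXiv: 30.6B tokens × 1 epoch = 30.6B, about 3.12%). A minimal sketch of that arithmetic follows; the function name `mix_row` is illustrative and not part of the dataset card:

```python
# Sketch of the effective-token arithmetic behind the SEA-LION data-mix table.
# The 30.6B / 1 epoch / 980B figures come from the diff; `mix_row` is a
# hypothetical helper, not an API from the repository.

def mix_row(tokens_b: float, epochs: int, total_b: float = 980.0):
    """Return (effective tokens in billions, share of the total mix in %)."""
    effective_b = tokens_b * epochs
    percentage = round(effective_b / total_b * 100, 2)
    return effective_b, percentage

effective, share = mix_row(30.6, 1)
print(effective, share)  # 30.6 3.12, matching the ArXiv row of the table
```

With one epoch the effective count equals the raw count; sources repeated over multiple epochs would contribute epochs × tokens to the 980B total.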