Update README.md
README.md CHANGED
@@ -113,7 +113,7 @@ Please refer to the [README.md](https://github.com/llm-jp/llm-jp-tokenizer) of `
 
 ### Pre-training
 
-The models have been pre-trained using a blend of the following
+The models have been pre-trained using a blend of the following datasets.
 
 | Language | Dataset | Tokens|
 |:---:|:---:|:---:|