Update README.md
README.md
CHANGED
@@ -107,7 +107,7 @@ Please refer to the [README.md](https://github.com/llm-jp/llm-jp-tokenizer) of `
 
 ### Pre-training
 
-The models have been pre-trained using a blend of the following
+The models have been pre-trained using a blend of the following datasets.
 
 | Language | Dataset | Tokens|
 |:---:|:---:|:---:|