Update README.md
README.md CHANGED

@@ -56,7 +56,7 @@ This model was finetuned for 26 billion tokens over 408,000 steps on a TPU v3-8
 
 ## Intended Use and Limitations
 
-BERTIN-GPT-J-
+BERTIN-GPT-J-6B learns an inner representation of the Spanish language that can be used to extract features useful for downstream tasks. The model is best at what it was pretrained for however, which is generating text from a prompt.
 
 ### How to use
 