This way, the model learns an inner representation of Portuguese biomedical language that can then be used to extract features useful for downstream tasks (a minimal feature-extraction sketch follows the usage example below). The model is, however, best at what it was pretrained for: generating text from a prompt.

## How to use GPT2-BioPT with HuggingFace
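
The snippet below loads GPT2-BioPT through the `transformers` text-generation pipeline and generates a continuation of a short Portuguese prompt ("O paciente chegou no hospital", "The patient arrived at the hospital"):
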
```
from transformers import pipeline

# Load the GPT2-BioPT model and tokenizer into a text-generation pipeline
chef = pipeline('text-generation', model="pucpr/gpt2-bio-pt", tokenizer="pucpr/gpt2-bio-pt", config={'max_length': 800}, framework="tf")

# Generate a continuation of the Portuguese prompt "O paciente chegou no hospital"
# ("The patient arrived at the hospital") and keep only the generated text
result = chef('O paciente chegou no hospital')[0]['generated_text']
print(result)
```
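
The pretraining paragraph above notes that the learned representation can also serve as features for downstream tasks. The following is a minimal sketch of that idea, not part of the original card; it assumes the repository provides TensorFlow weights, which the `framework="tf"` argument above suggests:

```
from transformers import AutoTokenizer, TFAutoModel

# Load the tokenizer and the base model without the language-modeling head
# (assumes TF weights are available, as framework="tf" above suggests)
tokenizer = AutoTokenizer.from_pretrained("pucpr/gpt2-bio-pt")
model = TFAutoModel.from_pretrained("pucpr/gpt2-bio-pt")

# Encode a Portuguese sentence and run a forward pass
inputs = tokenizer("O paciente chegou no hospital", return_tensors="tf")
outputs = model(**inputs)

# One contextual vector per input token; these can be pooled and fed to a
# downstream classifier or other task-specific head
features = outputs.last_hidden_state
print(features.shape)
```

For plain text generation, the pipeline call above remains the simpler entry point.
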
## Citation
*soon*