raquelsilveira committed 00775af (parent: 0c18fdb): Update README.md

README.md CHANGED
@@ -44,4 +44,8 @@ from transformers import AutoModel # or BertModel, for BERT without pretraining
 
 model = AutoModelForPreTraining.from_pretrained('raquelsilveira/legalbertpt_sc')
 tokenizer = AutoTokenizer.from_pretrained('raquelsilveira/legalbertpt_sc')
-```
+```
+
+## Cite as
+
+Raquel Silveira, Caio Ponte, Vitor Almeida, Vládia Pinheiro, and Vasco Furtado. 2023. LegalBert-pt: A Pretrained Language Model for the Brazilian Portuguese Legal Domain. In Intelligent Systems: 12th Brazilian Conference, BRACIS 2023, Belo Horizonte, Brazil, September 25–29, 2023, Proceedings, Part III. Springer-Verlag, Berlin, Heidelberg, 268–282. https://doi.org/10.1007/978-3-031-45392-2_18
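
The two `from_pretrained` lines in the diff above load a BERT-style checkpoint, so the model can be exercised with a masked-token prediction. A minimal sketch (the Portuguese example sentence and the top-5 cutoff are illustrative choices, not from the README):

```python
# Sketch: fill-mask inference with the LegalBert-pt checkpoint.
# Requires `transformers` and `torch`; downloads the model on first run.
import torch
from transformers import AutoModelForPreTraining, AutoTokenizer

model = AutoModelForPreTraining.from_pretrained('raquelsilveira/legalbertpt_sc')
tokenizer = AutoTokenizer.from_pretrained('raquelsilveira/legalbertpt_sc')
model.eval()

# Illustrative legal-domain sentence with one [MASK] token.
text = "O réu foi condenado ao pagamento de [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# For a BERT pretraining head, `prediction_logits` has shape
# (batch, seq_len, vocab_size); pick the position of the [MASK] token.
mask_idx = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top = outputs.prediction_logits[0, mask_idx[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top.tolist()))
```

Because `AutoModelForPreTraining` keeps both pretraining heads (MLM and NSP) for BERT checkpoints, the masked-LM logits live in `prediction_logits`; for plain feature extraction, `AutoModel` (as noted in the import comment in the diff) returns hidden states instead.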