ai-forever committed
Commit: 5d24d1d
Parent: 685ab0b

Update README.md

Files changed (1): README.md (+1, -2)
README.md CHANGED
````diff
@@ -21,7 +21,6 @@ An extensive dataset with “artificial” errors was taken as a training corpus
 - [SAGE library announcement](https://youtu.be/yFfkV0Qjuu0), DataFest 2023
 - [Paper about synthetic error generation methods](https://www.dialog-21.ru/media/5914/martynovnplusetal056.pdf), Dialogue 2023
 - [Paper about SAGE and our best solution](https://arxiv.org/abs/2308.09435), Review EACL 2024
-- Path to model = "ai-forever/FRED-T5-large-spell"
 
 ### Examples
 *Examples are given with default generation parameters
@@ -88,7 +87,7 @@ We compare our solution with both open automatic spell checkers and the ChatGPT
 ```python
 from transformers import T5ForConditionalGeneration, AutoTokenizer
 
-path_to_model = "<path_to_model>"
+path_to_model = "ai-forever/FRED-T5-large-spell"
 
 model = T5ForConditionalGeneration.from_pretrained(path_to_model)
 tokenizer = AutoTokenizer.from_pretrained(path_to_model, eos_token="</s>")
```
````
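After this change, the README's snippet loads the model directly from the Hub instead of asking the reader to substitute a placeholder path. A minimal sketch of how the corrected snippet might be used end-to-end follows; note that the `"Исправь: "` task prefix, the input sentence, and the generation settings are assumptions for illustration and are not part of this commit.

```python
from transformers import T5ForConditionalGeneration, AutoTokenizer

# Hub identifier as set by this commit
path_to_model = "ai-forever/FRED-T5-large-spell"

model = T5ForConditionalGeneration.from_pretrained(path_to_model)
tokenizer = AutoTokenizer.from_pretrained(path_to_model, eos_token="</s>")

# Assumed task prefix for spell correction; check the model card for the exact one
prefix = "Исправь: "
sentence = "Я пшел домой"  # deliberately misspelled input

# Encode, generate a correction, and decode back to text
encodings = tokenizer(prefix + sentence, return_tensors="pt")
generated = model.generate(**encodings, max_new_tokens=64)
result = tokenizer.batch_decode(generated, skip_special_tokens=True)
print(result)
```

Downloading the checkpoint is the expensive step here; once cached, repeated calls only pay for generation.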