Tags: NeMo · PyTorch · English · seq2seq · masked language modeling
MaximumEntropy committed
Commit 91eb318
1 Parent(s): f967637

Update README.md

Files changed (1):
  README.md +2 -2
README.md CHANGED

@@ -61,7 +61,7 @@ python megatron_t5_eval.py \
  --tensor_model_parallel_size 2
 ```
 
-The script will automatically replace all <mask> tokens with the appropriate sentinel tokens used while pre-training and attempt to fill them in autoregressively with greedy decoding.
+The script will automatically replace all \<mask\> tokens with the appropriate sentinel tokens used while pre-training and attempt to fill them in autoregressively with greedy decoding.
 
 
 *Expected Response*:
@@ -80,7 +80,7 @@ The script will automatically replace all <mask> tokens with the appropriate sen
 
 - prompt: The provided raw prompt as input
 - completion:
-- text: The final generated text from the model along with special/sentinel tokens besides "</s>"
+- text: The final generated text from the model along with special/sentinel tokens besides \</s\>
 - tokens: Each individual subword that is generated along with its log-probability.
 - masked_input: The original raw prompt with <mask> replaced with appropriate sentinel tokens.
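The mask-to-sentinel substitution the README describes can be sketched as follows. This is a minimal illustration, not the script's actual implementation; the `<extra_id_N>` names follow the common T5 sentinel convention and are an assumption here, since the real tokens depend on the tokenizer used at pre-training time.

```python
import re


def replace_masks_with_sentinels(prompt: str) -> str:
    """Replace each <mask> token with a numbered sentinel token.

    Sentinel names follow the usual T5 convention <extra_id_N>
    (an assumption; the actual tokens come from the pre-training
    tokenizer). Each successive <mask> gets the next sentinel index.
    """
    counter = 0

    def next_sentinel(_match: re.Match) -> str:
        nonlocal counter
        token = f"<extra_id_{counter}>"
        counter += 1
        return token

    return re.sub(r"<mask>", next_sentinel, prompt)


# Example: two masks in one prompt become two distinct sentinels,
# mirroring the "masked_input" field in the expected response.
masked_input = replace_masks_with_sentinels(
    "The capital of <mask> is Paris and the capital of <mask> is Rome."
)
print(masked_input)
```

The model then fills in each sentinel span autoregressively with greedy decoding, as the README notes.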