PreTrainedTokenizer(name_or_path='', vocab_size=38, model_max_len=1000000000000000019884624838656, is_fast=False, padding_side='right', truncation_side='right', special_tokens={'bos_token': '<s>', 'eos_token': '</s>', 'unk_token': '[UNK]', 'pad_token': '[PAD]'})