Transformers
English
tau/sled
t5-v1_1-large-sled / tokenizer_config.json
maorivgi · initial commit · 3d0a008
113 Bytes
{
"tokenizer_class": "SledTokenizer",
"base_tokenizer": "google/t5-v1_1-large",
"model_max_length": 16384
}
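As a sketch of what this config encodes: SLED registers a custom `SledTokenizer` class that delegates to the standard `google/t5-v1_1-large` tokenizer while raising the usable context window to 16384 tokens (well beyond T5's default 512). The snippet below simply parses the JSON shown above to illustrate those fields; the inlined string mirrors the file contents and is not fetched from the Hub.

```python
import json

# tokenizer_config.json as shown above, inlined for illustration.
config_text = """
{
  "tokenizer_class": "SledTokenizer",
  "base_tokenizer": "google/t5-v1_1-large",
  "model_max_length": 16384
}
"""

config = json.loads(config_text)

# The custom tokenizer class SLED's code maps to the backbone tokenizer.
print(config["tokenizer_class"])   # SledTokenizer

# The backbone whose vocabulary and tokenization rules are reused.
print(config["base_tokenizer"])    # google/t5-v1_1-large

# SLED's extended context length, in tokens.
print(config["model_max_length"])  # 16384
```

In practice this file is read by `transformers` when loading the tokenizer for `tau/t5-v1_1-large-sled`; since `SledTokenizer` is not a built-in `transformers` class, loading it typically requires SLED's own package (hedged assumption based on the SLED project's setup) to be installed so the class can be resolved.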