google/flan-t5-xl
Text2Text Generation · Transformers · PyTorch · TensorFlow · JAX · Safetensors
10 datasets · 5 languages
Tags: t5, text-generation-inference, Inference Endpoints
arXiv: 2210.11416, 1910.09700
License: apache-2.0
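The tags above advertise standard Transformers support, so the checkpoint can be loaded through the usual auto classes. A minimal inference sketch, assuming the transformers and torch packages are installed (the prompt is an arbitrary example, not from this page):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# First call downloads and caches the tokenizer and the sharded checkpoint.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xl")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xl")

# FLAN-T5 is an instruction-tuned text2text model: prompt in, text out.
inputs = tokenizer("Translate English to German: How old are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The TensorFlow and Flax weights listed below load the same way through TFAutoModelForSeq2SeqLM and FlaxAutoModelForSeq2SeqLM, matching the TF and Flax shard files in the repo.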
flan-t5-xl (at refs/pr/1)
7 contributors · History: 11 commits
Latest commit: Fix model name (6872873) by mrm8488, about 2 years ago
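Because this listing is for the refs/pr/1 ref rather than the default branch, downloads can be pinned to exactly this state. A hedged sketch using the standard `revision` argument of `from_pretrained` (the ref and short hash come from the header above):

```python
from transformers import AutoModelForSeq2SeqLM

# `revision` accepts a branch name, a ref such as "refs/pr/1", or a commit
# hash, so the resolved files cannot change underneath you.
model = AutoModelForSeq2SeqLM.from_pretrained(
    "google/flan-t5-xl",
    revision="refs/pr/1",  # the ref shown above; a commit hash also works
)
```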
File                                 Size     LFS  Commit message                         Last modified
.gitattributes                       1.43 kB       initial commit                         about 2 years ago
README.md                            11 kB         Fix model name                         about 2 years ago
config.json                          1.48 kB       Upload FlaxT5ForConditionalGeneration  about 2 years ago
flax_model-00001-of-00002.msgpack    9.97 GB  LFS  Upload FlaxT5ForConditionalGeneration  about 2 years ago
flax_model-00002-of-00002.msgpack    1.43 GB  LFS  Upload FlaxT5ForConditionalGeneration  about 2 years ago
flax_model.msgpack.index.json        51.2 kB       Upload FlaxT5ForConditionalGeneration  about 2 years ago
pytorch_model-00001-of-00002.bin     9.45 GB  LFS  Upload T5ForConditionalGeneration      about 2 years ago
pytorch_model-00002-of-00002.bin     1.95 GB  LFS  Upload T5ForConditionalGeneration      about 2 years ago
pytorch_model.bin.index.json         50.8 kB       Upload T5ForConditionalGeneration      about 2 years ago
special_tokens_map.json              2.2 kB        Upload tokenizer                       about 2 years ago
spiece.model                         792 kB   LFS  Upload tokenizer                       about 2 years ago
tf_model-00001-of-00002.h5           9.97 GB  LFS  Upload TFT5ForConditionalGeneration    about 2 years ago
tf_model-00002-of-00002.h5           1.43 GB  LFS  Upload TFT5ForConditionalGeneration    about 2 years ago
tf_model.h5.index.json               68.5 kB       Upload TFT5ForConditionalGeneration    about 2 years ago
tokenizer.json                       2.42 MB       Upload tokenizer                       about 2 years ago
tokenizer_config.json                2.54 kB       Upload tokenizer                       about 2 years ago

Note: pytorch_model-00001-of-00002.bin and pytorch_model-00002-of-00002.bin are pickle-serialized; the Hub's scanner detected three pickle imports in each shard: collections.OrderedDict, torch._utils._rebuild_tensor_v2, and torch.FloatStorage.
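The detected pickle imports noted above are the standard torch serialization internals, but unpickling untrusted files can in general execute arbitrary code, which is why the Hub surfaces them. The *.index.json files map each parameter name to the shard that stores it, and `from_pretrained` resolves sharded checkpoints transparently. A small exploratory sketch, assuming huggingface_hub is installed (the repo id and filename are from the listing above; everything else is illustrative):

```python
import json
from collections import Counter

from huggingface_hub import hf_hub_download

# Fetch only the 50.8 kB shard index, not the multi-GB weight shards.
index_path = hf_hub_download("google/flan-t5-xl", "pytorch_model.bin.index.json")
with open(index_path) as f:
    index = json.load(f)

print("total checkpoint size (bytes):", index["metadata"]["total_size"])

# weight_map is {parameter name -> shard filename}; summarize per shard.
for shard, n_tensors in Counter(index["weight_map"].values()).items():
    print(f"{shard}: {n_tensors} tensors")
```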