pszemraj/led-large-book-summary (97 likes)
Tags: Summarization · Transformers · PyTorch · Safetensors · kmfoda/booksum · English · led · text2text-generation · summary · longformer · booksum · long-document · long-form · Eval Results · Inference Endpoints
DOI: 10.57967/hf/0101
arXiv: 2105.08209
Licenses: apache-2.0, bsd-3-clause
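The tags above mark this repository as a summarization model served through the Transformers library. A rough usage sketch, assuming the `transformers` package is installed; the generation settings below are illustrative choices, not values taken from this page:

```python
from transformers import pipeline

# Load the LED-based summarizer from the Hub (weights download on first use).
summarizer = pipeline(
    "summarization",
    model="pszemraj/led-large-book-summary",
)

long_text = "..."  # a long document, e.g. a book chapter

# Generation parameters here are example values, not documented defaults.
result = summarizer(
    long_text,
    max_length=256,
    min_length=16,
    no_repeat_ngram_size=3,
)
print(result[0]["summary_text"])
```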
Files and versions
3 contributors · History: 44 commits
Latest commit by pszemraj: "Delete training_args.bin" (477d608, over 1 year ago)
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.23 kB | Adding `safetensors` variant of this model (#14) | over 1 year ago |
| .gitignore | 13 Bytes | update model with additional 1.8ish epochs training | over 2 years ago |
| README.md | 26.1 kB | Add verifyToken field to verify evaluation results are produced by Hugging Face's automatic model evaluator (#11) | almost 2 years ago |
| config.json | 1.44 kB | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| ds_config_zero2.json | 895 Bytes | Upload ds_config_zero2.json | over 2 years ago |
| latest | 14 Bytes | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| merges.txt | 456 kB | add tokenizer | almost 3 years ago |
| model.safetensors (LFS) | 1.84 GB | Adding `safetensors` variant of this model (#14) | over 1 year ago |
| pytorch_model.bin (LFS) | 1.84 GB | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| rng_state_0.pth (LFS) | 14.4 kB | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| special_tokens_map.json | 772 Bytes | add tokenizer | almost 3 years ago |
| tokenizer.json | 2.11 MB | update model with additional 1.8ish epochs training | over 2 years ago |
| tokenizer_config.json | 1.32 kB | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| trainer_state.json | 7.24 kB | add additional 2-epoch checkpoint, better regularization | over 2 years ago |
| vocab.json | 798 kB | add tokenizer | almost 3 years ago |

pytorch_model.bin is a pickle checkpoint (detected pickle imports: torch._utils._rebuild_tensor_v2, torch.FloatStorage, collections.OrderedDict).
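The listing contains both a safetensors checkpoint (model.safetensors) and a pickle-based one (pytorch_model.bin). A minimal sketch of loading the safetensors variant explicitly, assuming a recent transformers release where `from_pretrained` accepts the `use_safetensors` flag:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Prefer the safetensors weights over the pickle-based pytorch_model.bin.
model = AutoModelForSeq2SeqLM.from_pretrained(
    "pszemraj/led-large-book-summary",
    use_safetensors=True,
)
tokenizer = AutoTokenizer.from_pretrained("pszemraj/led-large-book-summary")
```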