bprateek/product_description_generator

Tags: Text2Text Generation · Transformers · PyTorch · TensorBoard · t5 · Generated from Trainer · text-generation-inference · Inference Endpoints
License: apache-2.0
Files and versions: 1 contributor, 9 commits
Latest commit: 3e86c02 by SFconvertbot, "Adding `safetensors` variant of this model", over 1 year ago
Name                      Size        Last commit                                   Date
runs/                     -           End of training                               over 1 year ago
.gitattributes            1.31 kB     Adding `safetensors` variant of this model    over 1 year ago
.gitignore                13 Bytes    Upload 8 files                                over 1 year ago
README.md                 2.45 kB     update model card README.md                   over 1 year ago
config.json               1.47 kB     End of training                               over 1 year ago
dls_eng_Batch4.pkl        138 MB (LFS)  Upload 8 files                              over 1 year ago
  Unsafe: pickle. Detected pickle imports (28):
  fastcore.transform.Pipeline, tokenizers.Tokenizer, torch.device, __builtin__.getattr,
  pathlib.PosixPath, numpy.dtype, fastai.text.data.LMTensorText, __builtin__.unicode,
  fastai.data.core.DataLoaders, fastai.text.data._maybe_first, fastai.text.data.LMDataLoader,
  random.Random, fastcore.imports.noop, fastai.data.core.TfmdLists, _codecs.encode,
  numpy.core.multiarray._reconstruct, __main__.TransformersTokenizer, __builtin__.tuple,
  fastai.torch_core.Chunks, tokenizers.models.Model,
  transformers.models.gpt2.tokenization_gpt2_fast.GPT2TokenizerFast, torch.Tensor,
  fastcore.foundation.L, fastcore.xtras.ReindexCollection, fastai.data.load._wif,
  numpy.core.multiarray.scalar, fastai.data.load._FakeLoader, numpy.ndarray
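The `dls_eng_Batch4.pkl` file is flagged as unsafe because unpickling executes code: the import list above is what the hub's scanner found in the file's opcodes. The same kind of scan can be sketched with only the standard library's `pickletools`, which walks a pickle's opcodes without running them. The demo payload below is a hypothetical stand-in for illustration, not this repo's file, and the sketch only covers the text-protocol `GLOBAL` opcode (newer protocols use `STACK_GLOBAL` instead):

```python
import pathlib
import pickle
import pickletools

def list_pickle_globals(data: bytes) -> list[str]:
    """List the module.name imports a pickle would perform, without unpickling it."""
    found = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # arg is "module name" separated by a space, e.g. "pathlib PurePosixPath"
            found.append(arg.replace(" ", "."))
    return found

# Hypothetical payload: protocol 0 emits GLOBAL opcodes in text form.
payload = pickle.dumps(pathlib.PurePosixPath("x"), protocol=0)
print(list_pickle_globals(payload))  # e.g. ['pathlib.PurePosixPath']
```

Anything the scan reports outside a known-safe allowlist is a reason not to unpickle the file at all; that is the motivation for the `safetensors` variant added in this repo's latest commit.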
generation_config.json    112 Bytes        End of training                              over 1 year ago
history_epoch1.csv        382 Bytes (LFS)  End of training                              over 1 year ago
model.safetensors         242 MB (LFS)     Adding `safetensors` variant of this model   over 1 year ago
pytorch_model.bin         242 MB (LFS)     End of training                              over 1 year ago
special_tokens_map.json   2.2 kB           End of training                              over 1 year ago
tokenizer.json            2.42 MB          End of training                              over 1 year ago
tokenizer_config.json     2.32 kB          End of training                              over 1 year ago
training_args.bin         3.77 kB (LFS)    End of training                              over 1 year ago
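The `model.safetensors` file added by SFconvertbot carries the same weights as `pytorch_model.bin` (both 242 MB) in a format that can be read without unpickling. The on-disk layout is deliberately simple: an 8-byte little-endian header length, a JSON header mapping each tensor name to its dtype, shape, and byte offsets, then the raw tensor data. A stdlib-only sketch of that layout (the file name and tensor name below are made up for illustration; this is not the official `safetensors` library):

```python
import json
import struct

def write_safetensors(path: str, tensors: dict) -> None:
    """Minimal safetensors writer: tensors maps name -> (dtype, shape, raw_bytes)."""
    header, offset, blobs = {}, 0, []
    for name, (dtype, shape, raw) in tensors.items():
        header[name] = {"dtype": dtype, "shape": shape,
                        "data_offsets": [offset, offset + len(raw)]}
        offset += len(raw)
        blobs.append(raw)
    meta = json.dumps(header).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(meta)))  # 8-byte little-endian header size
        f.write(meta)                          # JSON metadata, no executable content
        for raw in blobs:                      # raw tensor bytes, in declaration order
            f.write(raw)

def read_safetensors_header(path: str) -> dict:
    """Read only the JSON header: plain data, so no code runs, unlike pickle."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<Q", f.read(8))
        return json.loads(f.read(n))

# Hypothetical single-tensor file: one 2x2 float32 tensor named "weight".
write_safetensors("demo.safetensors",
                  {"weight": ("F32", [2, 2], struct.pack("<4f", 1, 2, 3, 4))})
print(read_safetensors_header("demo.safetensors"))
```

Because the header alone describes every tensor's location, a reader can also memory-map or lazily load individual tensors, which is why recent versions of `transformers` prefer `model.safetensors` over `pytorch_model.bin` when both are present.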