relu/essay-generation
Text Generation · Transformers · PyTorch · gptj · Inference Endpoints
License: openrail
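
The tags above describe a GPT-J-style causal language model in the Transformers/PyTorch stack (the commit messages below reference GPTJForCausalLM). A minimal loading-and-generation sketch, assuming the repo id relu/essay-generation resolves on the Hub and that the checkpoint loads through AutoModelForCausalLM; the prompt and sampling settings are placeholders, not part of this repo:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "relu/essay-generation"  # repo id as shown at the top of this page

# Downloads the tokenizer files and the sharded pytorch_model-*.bin checkpoints
# listed below; note the shards are 10+ GB, so loading needs substantial memory.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

prompt = "Write a short essay about the value of public libraries."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=300, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
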
Files and versions (main branch)
1 contributor · History: 12 commits
Latest commit: nuxlear · Upload tokenizer · 9206dc1 · almost 2 years ago
.gitattributes · Safe · 1.48 kB · initial commit · almost 2 years ago
README.md · Safe · 26 Bytes · initial commit · almost 2 years ago
config.json · Safe · 1.01 kB · Upload GPTJForCausalLM · almost 2 years ago
pytorch_model-00001-of-00002.bin · Safe · 10 GB · LFS · Upload GPTJForCausalLM · almost 2 years ago
  Detected pickle imports (4): torch.HalfStorage, torch.BoolStorage, torch._utils._rebuild_tensor_v2, collections.OrderedDict
pytorch_model-00001-of-00003.bin · Safe · 24.9 GB · LFS · Upload GPTJForCausalLM · almost 2 years ago
  Detected pickle imports (3): collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
pytorch_model-00002-of-00002.bin · Safe · 2.42 GB · LFS · Upload GPTJForCausalLM · almost 2 years ago
  Detected pickle imports (4): collections.OrderedDict, torch.HalfStorage, torch._utils._rebuild_tensor_v2, torch.BoolStorage
pytorch_model-00002-of-00003.bin · Safe · 24.9 GB · LFS · Upload GPTJForCausalLM · almost 2 years ago
  Detected pickle imports (3): torch._utils._rebuild_tensor_v2, collections.OrderedDict, torch.FloatStorage
pytorch_model-00003-of-00003.bin · Safe · 24.7 GB · LFS · Upload GPTJForCausalLM · almost 2 years ago
  Detected pickle imports (3): collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage
pytorch_model.bin.index.json · Safe · 25.8 kB · Upload GPTJForCausalLM · almost 2 years ago
special_tokens_map.json · Safe · 125 Bytes · Upload tokenizer · almost 2 years ago
tokenizer.json · Safe · 3.49 MB · Upload tokenizer · almost 2 years ago
tokenizer_config.json · Safe · 335 Bytes · Upload tokenizer · almost 2 years ago
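
The pytorch_model-*.bin files are pickle-serialized PyTorch checkpoint shards, which is why the listing reports detected pickle imports; the torch.HalfStorage entries in the two-shard set versus torch.FloatStorage in the three-shard set suggest half-precision and full-precision exports of the weights. pytorch_model.bin.index.json maps each parameter name to the shard that stores it, and from_pretrained reads it to reassemble the full state dict. A small sketch for inspecting such an index locally, assuming the standard Transformers sharded-checkpoint layout with "metadata" and "weight_map" keys:

import json

# Assumes pytorch_model.bin.index.json has been downloaded next to this script.
with open("pytorch_model.bin.index.json") as f:
    index = json.load(f)

print("total checkpoint size (bytes):", index["metadata"]["total_size"])

# weight_map: parameter name -> shard file that stores it, e.g.
# "transformer.wte.weight" -> "pytorch_model-00001-of-00002.bin"
for name, shard in list(index["weight_map"].items())[:5]:
    print(name, "->", shard)

None of this is needed when loading through from_pretrained, which resolves the shards automatically; the sketch is only for examining the shard layout by hand.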