taufeeque/wiki-finetuned-pythia-70m-deduped
Tags: Text Generation · Transformers · PyTorch · gpt_neox · text-generation-inference · Inference Endpoints
wiki-finetuned-pythia-70m-deduped / special_tokens_map.json (revision 52f443b)
Last commit: "add tokenizer" by taufeeque (8dc3a1c, almost 2 years ago)
File size: 99 Bytes
{
  "bos_token": "<|endoftext|>",
  "eos_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
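All three special tokens map to the same "<|endoftext|>" string, as GPT-NeoX-based Pythia tokenizers typically do. A minimal sketch of how this file is consumed: loading the tokenizer for this repo with the transformers library picks up special_tokens_map.json, and the entries become attributes on the tokenizer object. The repo id below is taken from this page; network access to the Hugging Face Hub is assumed.

from transformers import AutoTokenizer

# Loading the tokenizer reads special_tokens_map.json from the repo.
tokenizer = AutoTokenizer.from_pretrained("taufeeque/wiki-finetuned-pythia-70m-deduped")

# Each entry in the file is exposed as a tokenizer attribute;
# here all three resolve to the same "<|endoftext|>" token.
print(tokenizer.bos_token)           # <|endoftext|>
print(tokenizer.eos_token)           # <|endoftext|>
print(tokenizer.unk_token)           # <|endoftext|>
print(tokenizer.special_tokens_map)  # dict mirroring the contents of this file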