nicholasKluge/Aira-OPT-125M
Pipeline: Text Generation
Libraries: Transformers, PyTorch, Safetensors, text-generation-inference, Inference Endpoints
Dataset: nicholasKluge/instruct-aira-dataset
Language: English
Tags: opt, alignment, instruction tuned, text generation, conversation, assistant, Carbon Emissions
Papers: arXiv:1803.05457, arXiv:2109.07958, arXiv:2203.09509
License: other
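The tags above (Text Generation, Transformers, Safetensors) point to the standard Transformers loading path. A minimal sketch, not taken from the model card: the plain-string prompt is an assumption, since the instruction format Aira was tuned on is defined in the README rather than on this page, and use_safetensors=True merely prefers the model.safetensors weights over the pickled pytorch_model.bin that is also in the repository.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "nicholasKluge/Aira-OPT-125M"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# Prefer the safetensors weights over the pickled pytorch_model.bin.
model = AutoModelForCausalLM.from_pretrained(repo_id, use_safetensors=True)

# Plain-string prompt; the repo's README may define a specific
# instruction template that this sketch does not reproduce.
inputs = tokenizer("What is alignment in machine learning?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```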
Files and versions
Aira-OPT-125M: 3 contributors, 27 commits.
Latest commit: "Update README.md" by nicholasKluge (0f0af99, 12 months ago).
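For reproducible downloads, loading can be pinned to the commit shown above. A short sketch, assuming the Transformers library; the revision argument accepts a branch name, tag, or commit hash:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "nicholasKluge/Aira-OPT-125M"
revision = "0f0af99"  # the "Update README.md" commit listed above

# Pinning to a commit hash guards against later changes to the repository.
tokenizer = AutoTokenizer.from_pretrained(repo_id, revision=revision)
model = AutoModelForCausalLM.from_pretrained(repo_id, revision=revision)
```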
File                           Size        Last commit message                                  Last updated
.gitattributes                 1.52 kB     initial commit                                       about 1 year ago
Aira_emissions.csv             778 Bytes   Upload Aira_emissions.csv with huggingface_hub       about 1 year ago
LICENSE.md                     11.1 kB     Upload 4 files                                       about 1 year ago
README.md                      6.07 kB     Update README.md                                     12 months ago
added_tokens.json              143 Bytes   Upload tokenizer                                     about 1 year ago
config.json                    757 Bytes   Update config.json                                   about 1 year ago
generation_config.json         294 Bytes   Upload config                                        12 months ago
lm-evaluation-harness.ipynb    111 kB      Upload 4 files                                       about 1 year ago
merges.txt                     456 kB      Upload tokenizer                                     about 1 year ago
model.safetensors (LFS)        501 MB      Upload model.safetensors with huggingface_hub        about 1 year ago
optimizer.pt (LFS)             1 GB        Upload optimizer.pt with huggingface_hub             about 1 year ago
pytorch_model.bin (LFS)        501 MB      Upload OPTForCausalLM                                about 1 year ago
rng_state.pt (LFS)             6.25 kB     Upload rng_state.pt with huggingface_hub             about 1 year ago
scheduler.pt (LFS)             1 kB        Upload scheduler.pt with huggingface_hub             about 1 year ago
special_tokens_map.json        625 Bytes   Upload tokenizer                                     about 1 year ago
tokenizer.json                 2.11 MB     Upload tokenizer                                     about 1 year ago
tokenizer_config.json          1.32 kB     Upload tokenizer                                     about 1 year ago
training_stats.parquet (LFS)   2.35 kB     Upload training_stats.parquet with huggingface_hub   about 1 year ago
vocab.json                     798 kB      Upload tokenizer                                     about 1 year ago

Pickle scan: optimizer.pt and pytorch_model.bin import collections.OrderedDict, torch._utils._rebuild_tensor_v2, and torch.FloatStorage; rng_state.pt imports collections.OrderedDict, torch._utils._rebuild_tensor_v2, and torch.ByteStorage; scheduler.pt has no problematic imports.
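Individual artifacts, such as the carbon-emissions log or the training statistics, can be fetched without cloning the multi-gigabyte training checkpoints. A sketch using huggingface_hub's hf_hub_download; reading the Parquet file assumes pandas with a Parquet engine such as pyarrow installed, and since the column layouts are not documented on this page, the sketch only previews the files:

```python
import pandas as pd
from huggingface_hub import hf_hub_download

repo_id = "nicholasKluge/Aira-OPT-125M"

# Carbon-emissions log (778 Bytes) recorded during training.
emissions_path = hf_hub_download(repo_id=repo_id, filename="Aira_emissions.csv")
print(pd.read_csv(emissions_path))

# Training statistics (2.35 kB, stored via Git LFS).
stats_path = hf_hub_download(repo_id=repo_id, filename="training_stats.parquet")
print(pd.read_parquet(stats_path))
```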