sl-alex/llama-7b-alpaca-stepwise-lora-embtuned
Dataset: sl-alex/openai-prm800k-solutions-only
Language: English
License: apache-2.0
Files (branch: main): 1 contributor, 6 commits
Latest commit by sl-alex: 0bb8574, "upload embedding and unembedding weights in a safetensors format" (about 1 year ago)
File                       Size        Storage   Last commit message                                                          Age
.gitattributes             1.52 kB               initial commit                                                               over 1 year ago
README.md                  8.62 kB               Update README.md                                                             over 1 year ago
adapter_config.json        477 Bytes             Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
adapter_model.bin          320 MB      LFS       Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
added_tokens.json          129 Bytes             Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
embed_tokens.pt            262 MB      LFS       Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
lm_head.pt                 262 MB      LFS       Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
overlays.safetensors       524 MB      LFS       upload embedding and unembedding weights in a safetensors format             about 1 year ago
special_tokens_map.json    221 Bytes             Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
tokenizer.model            500 kB      LFS       Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago
tokenizer_config.json      727 Bytes             Upload QLoRA adapter, finetuned input/output embeddings, tokenizer config    over 1 year ago

Pickle note: adapter_model.bin, embed_tokens.pt, and lm_head.pt are pickle files; each has three detected pickle imports (collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.BFloat16Storage).