dunzhang/stella_en_400M_v5
Tags: Sentence Similarity · sentence-transformers · PyTorch · Safetensors · Transformers · new · feature-extraction · mteb · custom_code · Eval Results · text-embeddings-inference · Inference Endpoints
arXiv: 2205.13147
License: MIT
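The `sentence-transformers` and `custom_code` tags imply the model is loaded through the standard SentenceTransformer API with remote code enabled. The sketch below is an assumption based on those tags and on the commit message "Set 1024 as default dim, update usage snippets, store prompts in config": the prompt name `s2p_query` is hypothetical here, so check `config_sentence_transformers.json` in the repo for the actual prompt names.

```python
def cosine(a, b):
    """Cosine similarity of two equal-length vectors (pure-Python helper)."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return num / den

if __name__ == "__main__":
    # trust_remote_code=True is required because the repo ships its own
    # modeling.py / configuration.py (the `custom_code` tag above).
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("dunzhang/stella_en_400M_v5",
                                trust_remote_code=True)
    # Prompt name is an assumption; see config_sentence_transformers.json.
    query = model.encode(["What is sentence similarity?"],
                         prompt_name="s2p_query")
    doc = model.encode(["Sentence similarity measures how alike two texts are."])
    print(cosine(query[0].tolist(), doc[0].tolist()))
```

The several `2_Dense_*` folders listed below (256 through 8192) suggest the model exposes multiple output dimensions, with 1024 as the default per commit #1.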
Files and versions
Branch: main · 5 contributors · History: 13 commits
Latest commit: michaelfeil — Update infinity example (#23) — 24e2e1f (verified) — 8 days ago
| File / folder | Size | Last commit | Updated |
|---|---|---|---|
| 1_Pooling | — | Upload 17 files | 4 months ago |
| 2_Dense | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_1024 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_2048 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_256 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_4096 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_6144 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_768 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| 2_Dense_8192 | — | Adding `safetensors` variant of this model (#2) | 4 months ago |
| .gitattributes | 1.52 kB | initial commit | 4 months ago |
| README.md | 170 kB | Update infinity example (#23) | 8 days ago |
| config.json | 892 Bytes | Upload 10 files | 4 months ago |
| config_sentence_transformers.json | 397 Bytes | Set 1024 as default dim, update usage snippets, store prompts in config (#1) | 4 months ago |
| configuration.py | 7.13 kB | Upload 10 files | 4 months ago |
| model.safetensors | 1.74 GB (LFS) | Adding `safetensors` variant of this model (#2) | 4 months ago |
| modeling.py | 57.5 kB | Update modeling.py (#19) | about 2 months ago |
| modules.json | 316 Bytes | Set 1024 as default dim, update usage snippets, store prompts in config (#1) | 4 months ago |
| pytorch_model.bin | 1.74 GB (LFS, pickle) | Upload 10 files | 4 months ago |
| sentence_bert_config.json | 51 Bytes | Upload 10 files | 4 months ago |
| special_tokens_map.json | 695 Bytes | Upload 10 files | 4 months ago |
| tokenizer.json | 712 kB | Upload 10 files | 4 months ago |
| tokenizer_config.json | 1.38 kB | Upload 10 files | 4 months ago |
| vocab.txt | 232 kB | Upload 10 files | 4 months ago |

pytorch_model.bin is a pickle checkpoint; detected pickle imports (3): `torch._utils._rebuild_tensor_v2`, `torch.FloatStorage`, `collections.OrderedDict`.