ai-forever/sbert_large_nlu_ru
Tags: Feature Extraction, Transformers, PyTorch, Safetensors, Russian, bert, text-embeddings-inference, Inference Endpoints
4 contributors, 26 commits
Latest commit f2be770 (verified, 6 months ago) by ai-forever: Update tokenizer_config.json
Files:
  1_Pooling/                                          Upload 10 files               6 months ago
  .gitattributes                         445 Bytes    Upload 10 files               6 months ago
  README.md                              1.97 kB      Upload README.md              6 months ago
  config.json                            863 Bytes    Upload 10 files               6 months ago
  config_sentence_transformers.json      195 Bytes    Upload 10 files               6 months ago
  model.safetensors (LFS)                1.71 GB      Upload 10 files               6 months ago
  modules.json                           349 Bytes    Upload 10 files               6 months ago
  sentence_bert_config.json              53 Bytes     Upload 10 files               6 months ago
  special_tokens_map.json                125 Bytes    Upload 10 files               6 months ago
  tokenizer.json                         3.71 MB      Upload 10 files               6 months ago
  tokenizer_config.json                  1.24 kB      Update tokenizer_config.json  6 months ago
  vocab.txt                              1.78 MB      Upload 10 files               6 months ago
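The repository layout (modules.json, 1_Pooling/, config_sentence_transformers.json) indicates a sentence-transformers-style BERT checkpoint for feature extraction. A minimal sketch of loading it with plain transformers is below; the model ID is taken from this page, while the use of mean pooling over token embeddings is an assumption based on the 1_Pooling/ directory, not confirmed by this listing.

```python
import torch
from transformers import AutoModel, AutoTokenizer


def mean_pooling(model_output, attention_mask):
    """Average token embeddings into one sentence vector, ignoring padding."""
    token_embeddings = model_output[0]  # (batch, seq_len, hidden)
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)


def embed(sentences, model_name="ai-forever/sbert_large_nlu_ru"):
    """Download the checkpoint (model.safetensors is 1.71 GB) and return
    mean-pooled sentence embeddings. Pooling choice is an assumption."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    encoded = tokenizer(
        sentences, padding=True, truncation=True, return_tensors="pt"
    )
    with torch.no_grad():
        output = model(**encoded)
    return mean_pooling(output, encoded["attention_mask"])
```

Calling `embed(["Привет, мир!"])` would return one embedding row per input sentence; note the first call downloads the full checkpoint from the Hub.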