nomic-embed-text-v1-unsupervised: A Reproducible Long Context (8192) Text Embedder

nomic-embed-text-v1-unsupervised is an 8192-context-length text encoder. This checkpoint was saved after the contrastive pretraining stage of the multi-stage contrastive training pipeline that produced the final model. We release it to open-source the training artifacts described in our Nomic Embed Text tech report.

If you want to use a model to extract embeddings, we suggest using nomic-embed-text-v1.
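Long-context embedders in this family pool token-level hidden states into a single vector per text, typically via attention-masked mean pooling. The sketch below illustrates that pooling step on toy data; the array shapes and the `mean_pool` helper are illustrative assumptions, not code from the released model.

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over the sequence, ignoring padding positions."""
    # (batch, seq, 1) mask broadcast against (batch, seq, dim) embeddings
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = mask.sum(axis=1)
    # clip to avoid division by zero for fully padded rows
    return summed / np.clip(counts, 1e-9, None)

# Toy batch: 1 sequence, 4 tokens (last one is padding), embedding dim 3
toks = np.arange(12, dtype=np.float32).reshape(1, 4, 3)
mask = np.array([[1, 1, 1, 0]])
print(mean_pool(toks, mask))  # → [[3. 4. 5.]]
```

In practice you would feed the model's token embeddings and tokenizer attention mask into this pooling; for routine use, a higher-level library such as sentence-transformers handles this internally.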
