intfloat osanseviero committed on
Commit 2696861
1 Parent(s): b9fbfc1

Add link to new paper (#1)


- Add link to new paper (040d0121a88973cae67c013d62b14ef9a65ee4ec)


Co-authored-by: Omar Sanseviero <osanseviero@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -5383,6 +5383,9 @@ license: mit
 [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
 Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
 
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
+
 This model has 24 layers and the embedding size is 1024.
 
 ## Usage
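
The context lines of the diff state that the model has 24 layers and an embedding size of 1024. As a quick sanity check of those numbers, here is a minimal sketch using the `transformers` library; the repo id `intfloat/multilingual-e5-large` is an assumption (the diff itself does not name the model), so swap in the actual repository if it differs.

```python
from transformers import AutoConfig

# Assumption: this README belongs to intfloat/multilingual-e5-large,
# the model family covered by the newly linked technical report.
config = AutoConfig.from_pretrained("intfloat/multilingual-e5-large")

# The README claims 24 layers and an embedding size of 1024.
print(config.num_hidden_layers)  # expected: 24
print(config.hidden_size)        # expected: 1024
```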