Can I use Mistral as an embedding model?
Hi, I'm a newbie in this field. I want to know: can I use Mistral as an embedding model, the way Sentence Transformers does?
Yes, technically it's possible. But don't forget that Mistral is a decoder-only language model, i.e. it is trained to predict the next word from left to right only. Because of this unidirectional training objective, its embeddings may carry less rich linguistic information than those of bidirectional models (such as BERT, SBERT, or OpenAI's text embeddings).
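To get embeddings from a decoder-only model in practice, you run the text through the model, take the per-token hidden states from the last layer, and pool them into a single vector. A minimal sketch of just the pooling step, shown on plain numpy arrays rather than real Mistral hidden states (the function name and shapes are illustrative):

```python
import numpy as np

def pool_hidden_states(hidden, attention_mask, strategy="last"):
    """Collapse per-token hidden states (batch, seq, dim) into one
    vector per text. For decoder-only LMs like Mistral, the last
    non-padding token is a common choice, because it is the only
    position that has attended to the whole input."""
    if strategy == "last":
        last_idx = attention_mask.sum(axis=1) - 1  # index of last real token
        return hidden[np.arange(hidden.shape[0]), last_idx]
    # fallback: mean pooling over non-padding tokens
    masked = hidden * attention_mask[..., None]
    return masked.sum(axis=1) / attention_mask.sum(axis=1, keepdims=True)
```

With a real model you would feed it the last-layer `hidden_states` and the tokenizer's attention mask; padding tokens are excluded by the mask in both strategies.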
Could the model hallucinate?
I'm a beginner at building a RAG pipeline.
I've fine-tuned this model. How can I use the fine-tuned Mistral model to create vector embeddings? If I use BERT, the vector DB might not contain the information from the data I want.
If you want an embedding model that is specialised in your data/domain, you could consider fine-tuning SBERT.
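For context, SBERT-style fine-tuning typically optimises a contrastive objective such as the multiple-negatives ranking loss (what `sentence-transformers` calls `MultipleNegativesRankingLoss`): each (anchor, positive) text pair in a batch treats the other positives as in-batch negatives. A rough numpy sketch of that loss on precomputed embeddings (the `scale` value is illustrative):

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Multiple-negatives ranking loss: row i of `positives` is the
    correct match for row i of `anchors`; every other row in the
    batch acts as an in-batch negative."""
    # L2-normalise so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = scale * (a @ p.T)  # (batch, batch) similarity matrix
    # cross-entropy with the diagonal as the target class
    logits = sims - sims.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))
```

Minimising this pulls matched pairs together and pushes the rest of the batch apart, which is what makes the resulting embeddings useful for retrieval on your own domain.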
@yaya-sy You've mentioned that we could use decoder-only LLMs like Mistral to perform tasks such as STS. Why would someone use them over BERT-based models, which are optimized for STS tasks? Also, I've found https://docs.mistral.ai/guides/embeddings/, an embeddings API supported by Mistral AI; is this fine-tuned to perform well on STS?
@RoiandDae
I said that it's technically possible to use decoder LLMs as embedding models. But I also said that it's probably not the best choice compared to state-of-the-art embedding models (OpenAI Embeddings or Mistral Embeddings, as you said).
What do you mean by "STS"?
@yaya-sy STS refers to Semantic Textual Similarity tasks. I understand your point, but I'm curious whether decoder-only models are limited to text generation and similar tasks, or whether they have also been effectively fine-tuned for STS. That would suggest decoder-only models are versatile and not confined to tasks like text generation and next-word prediction.
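For reference, STS systems are usually scored by cosine similarity between the two sentence embeddings, regardless of which model produced them, so any model that yields a fixed-size vector per sentence can in principle be evaluated on STS. A minimal sketch of the scoring function:

```python
import numpy as np

def sts_score(emb_a, emb_b):
    """Cosine similarity between two sentence embeddings; this is the
    standard similarity function used in STS evaluation."""
    return float(np.dot(emb_a, emb_b) /
                 (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))
```

Benchmarks like STS-B then report the correlation (e.g. Spearman) between these scores and human similarity judgments.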
@RoiandDae Yeah, I see. You can give it a try.