Integrating with Langchain?
Hi BAAI team, many thanks for this exciting model.
Just wondering when BGE will work with Langchain, as the WeChat article mentioned this possibility.
Best,
Hi, Thanks for your interest!
Currently, BGE can be used in Langchain like this:

```python
from langchain.embeddings import HuggingFaceInstructEmbeddings

encode_kwargs = {'normalize_embeddings': True}
model = HuggingFaceInstructEmbeddings(
    model_name='BAAI/bge-large-en',
    # Passages are embedded without an instruction.
    embed_instruction="",
    # When retrieving passages for a short query, use query_instruction;
    # otherwise set it to "".
    query_instruction="Represent this sentence for searching relevant passages: ",
    encode_kwargs=encode_kwargs,
)
```
We will try to support more LLM tools in the future.
The HuggingFaceEmbeddings class in langchain.embeddings.huggingface relies on sentence_transformers. Could we instead use Hugging Face's transformers library (if it is equivalent to the model used in sentence_transformers) to generate embeddings?
Langchain now supports BGE models, and you can load them easily with HuggingFaceBgeEmbeddings following: https://github.com/FlagOpen/FlagEmbedding#using-langchain .
If you want to use HuggingFaceEmbeddings instead, you need to prepend the instruction to queries manually before using them to generate embeddings.
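The manual step above can be sketched as a small helper that prefixes the BGE retrieval instruction onto each query before it is passed to the embedding model; passages are embedded as-is. The helper name is hypothetical, but the instruction string matches the one in the snippet above:

```python
# BGE retrieval instruction for short queries (same string as in the
# HuggingFaceInstructEmbeddings example above).
BGE_QUERY_INSTRUCTION = "Represent this sentence for searching relevant passages: "

def add_query_instruction(queries):
    """Prefix each retrieval query with the BGE instruction.

    Only short retrieval queries need the prefix; passages (and queries in
    non-retrieval tasks) should be embedded without it.
    """
    return [BGE_QUERY_INSTRUCTION + q for q in queries]

# The prefixed strings can then be handed to HuggingFaceEmbeddings.embed_documents.
queries = add_query_instruction(["What is BGE?"])
# queries[0] == "Represent this sentence for searching relevant passages: What is BGE?"
```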