Is there a way to feed my domain-specific articles into an already-trained model?
I have domain-specific articles on my website, for example about superhero movies (non-generalist text).
I do not want to fully retrain FLAN-T5; what I want instead is to add my articles to the model's knowledge base.
Maybe in a way similar to the federated learning that smartphone keyboards use to recommend next words or personalize spelling corrections for each user (I am open to any solution, if there is one).
In my case I would like to feed the model my articles about superheroes so it can answer questions about the whole website that are not normally answerable with traditional search techniques, such as:
- In Batman VS Superman who wins the final battle and how?
- What is the main plot in Avengers: Infinity War?
The goal is to avoid spending 10-20M USD on training the whole model, and instead spend maybe a few thousand USD.
Hi Stromal,
Hope you are doing well!
I would say that you don't need fine-tuning, only in-context semantic search using a tool such as LangChain or LlamaIndex.
With this approach, the software adds context to the prompt by doing a similarity search over embeddings of your articles, then asks the model a question like: "Based on the following text: {text}. {Question}".
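To make the idea concrete, here is a minimal sketch of that retrieve-then-prompt loop. It uses a toy bag-of-words "embedding" and cosine similarity so it runs standalone; a real setup (LangChain, LlamaIndex) would use a proper sentence-embedding model and a vector store instead, and the two article snippets are made-up placeholders for your own content.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding": word -> count.
    # Real pipelines use a sentence-embedding model here.
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical stand-ins for your website's articles.
articles = [
    "In Batman v Superman the final battle is interrupted by Doomsday.",
    "Avengers Infinity War follows Thanos as he collects the Infinity Stones.",
]

def build_prompt(question, k=1):
    # Rank articles by similarity to the question and prepend the
    # top-k as context, then phrase the prompt for the model.
    q = embed(question)
    ranked = sorted(articles, key=lambda a: cosine(embed(a), q), reverse=True)
    context = "\n".join(ranked[:k])
    return f"Based on the following text: {context}. {question}"

print(build_prompt("What is the main plot in Avengers: Infinity War?"))
```

The prompt string returned by `build_prompt` is what you would then send to FLAN-T5 (or any other model); the model itself is never retrained, which is why this stays cheap.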
But if I were to fine-tune FLAN-T5 using local data, how would I go about it?
Note: I've already tried the semantic-search approach using LangChain and I'm really not satisfied with the results.
@naman-trilogy can you share why you were not satisfied with the results? (Just curious, as I'm interested in the same question: do we need to fine-tune, or can we just use LangChain to effectively query custom data sources?)