Model sans facts?
Would it be possible to create/train a model that doesn't train on facts, but only on language structure? For instance:
Instead of training on: "Isaac Newton formulated the theory of gravity",
can't we train on: "<PERSON_NAME> formulated the theory of <CONCEPT>"?
If this is possible, I think we could still capture the language structure while dramatically reducing the model size.
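For concreteness, here's a minimal sketch of the masking step I have in mind, assuming spaCy for off-the-shelf NER. The placeholder names are illustrative, and note that something like <CONCEPT> would need a custom tagger, since stock NER only covers people, orgs, places, and the like:

```python
# Sketch of the proposed preprocessing: replace named entities with
# typed placeholders before training, so the model only sees language
# structure. Assumes spaCy's small English model is installed
# (python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")

# Map spaCy entity labels to placeholder tokens (illustrative names).
PLACEHOLDERS = {
    "PERSON": "<PERSON_NAME>",
    "ORG": "<ORG_NAME>",
    "GPE": "<PLACE_NAME>",
}

def mask_entities(text: str) -> str:
    """Replace recognized entities with typed placeholder tokens."""
    doc = nlp(text)
    out = text
    # Replace from the end so earlier character offsets stay valid.
    for ent in reversed(doc.ents):
        placeholder = PLACEHOLDERS.get(ent.label_)
        if placeholder:
            out = out[:ent.start_char] + placeholder + out[ent.end_char:]
    return out

print(mask_entities("Isaac Newton formulated the theory of gravity."))
# Expected (if NER tags the name as PERSON):
# "<PERSON_NAME> formulated the theory of gravity."
```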
That would be a very interesting idea! Thank you!
An issue is that there might be inconsistencies between training and inference. E.g. the user might enter a prompt that includes facts, such as "Isaac Newton formulated the theory of", and the model here would answer "<CONCEPT>", which might not be what the user expects. So we might need additional mechanisms to handle the factual-knowledge part, e.g. retrofitting GPT-J.
Thank you again for your suggestion. We will definitely explore this direction!
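For illustration, a minimal sketch of such a mechanism: the structure-only model emits a typed placeholder, and a separate factual component fills it in. The `FACTS` dictionary and `resolve_placeholder` helper are toy stand-ins for a real knowledge base or retrieval index:

```python
# Toy fact store: (subject, relation) -> object. A real system would
# query a knowledge base or retrieval index instead.
FACTS = {
    ("Isaac Newton", "formulated the theory of"): "gravity",
}

def resolve_placeholder(prompt: str, model_output: str) -> str:
    """Fill a <CONCEPT> placeholder using the fact store."""
    if "<CONCEPT>" not in model_output:
        return model_output
    for (subject, relation), obj in FACTS.items():
        if subject in prompt and relation in prompt:
            return model_output.replace("<CONCEPT>", obj)
    return model_output  # no matching fact; leave the placeholder visible

# The structure-only model completes the masked pattern...
prompt = "Isaac Newton formulated the theory of"
model_output = "<CONCEPT>"  # what such a model would emit
# ...and the factual component resolves it.
print(resolve_placeholder(prompt, model_output))  # -> "gravity"
```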
@JEU007 thanks for replying! Missed the notification!
I was thinking along these lines for a model that's significantly smaller but can do two things really well: language semantics and, say, writing JS scripts. If this sort of model can run on a commercial-grade GPU, it would be wild!
Because then it opens up natural-language automation: this model + something like ScriptKit could do almost anything that can be done with a computer and a bunch of scripts!
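Very roughly, the loop I'm imagining looks like this. Everything here is hypothetical: `generate_script` stands in for the small model, and the scripts directory for wherever the runner (e.g. ScriptKit) picks files up:

```python
# Hand-wavy sketch of the automation loop: natural-language instruction
# in, generated JS script out, dropped where a script runner can see it.
from pathlib import Path

SCRIPTS_DIR = Path.home() / "scripts"  # assumption: the runner watches this dir

def generate_script(instruction: str) -> str:
    """Stand-in for the small model: turn a natural-language
    instruction into JS source. A real model call would go here."""
    return f'// {instruction}\nconsole.log("TODO: generated automation");\n'

def automate(instruction: str) -> Path:
    """Generate a JS script and place it where the runner can execute it."""
    SCRIPTS_DIR.mkdir(parents=True, exist_ok=True)
    path = SCRIPTS_DIR / "generated-task.js"
    path.write_text(generate_script(instruction))
    return path

print(automate("rename all screenshots on my desktop by date"))
```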