---
license: mit
language:
- en
pipeline_tag: text2text-generation
tags:
- legal
---
# flan-t5-kelm-tekgen-kg-w-context-small
Google's Flan-T5 model ([flan-t5-small](https://huggingface.co/google/flan-t5-small)) fine-tuned on KG triples from the [KELM TEKGEN Corpus](https://github.com/google-research-datasets/KELM-corpus#part-1-tekgen-training-corpus), using the training method of [KGT-5](https://huggingface.co/spaces/apoorvumang/kgt5), with additional context supplied alongside the prompts.
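As a rough illustration of the KGT-5-style setup described above, the sketch below verbalizes a KG triple (plus optional context) into a tail-prediction prompt. The template and the `verbalize_triple` helper are illustrative assumptions, not this checkpoint's documented interface; consult the training code for the exact format.

```python
def verbalize_triple(head: str, relation: str, context: str = "") -> str:
    """Build a KGT-5-style tail-prediction prompt from a KG triple.

    The "predict tail: ..." template is an assumed example of triple
    verbalization; the actual prompt format used during fine-tuning
    may differ.
    """
    prompt = f"predict tail: {head} | {relation}"
    if context:
        # The extra context is what distinguishes this "w-context" variant.
        prompt += f" | context: {context}"
    return prompt


prompt = verbalize_triple(
    "Barack Obama",
    "place of birth",
    context="Barack Obama is a former president of the United States.",
)
print(prompt)
```

A prompt built this way can then be passed to the model through a standard `text2text-generation` pipeline from the `transformers` library.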