---
library_name: transformers
tags: []
---
# Translation and Fusion Improves Zero-shot Cross-lingual Information Extraction

## Summary
We propose TransFusion, a framework in which models are fine-tuned to use English translations of low-resource language data, enabling more precise predictions through annotation fusion.
Building on TransFusion, we introduce GoLLIE-TF, a cross-lingual instruction-tuned LLM for IE tasks, designed to close the performance gap between high- and low-resource languages.
- Paper: [Translation and Fusion Improves Zero-shot Cross-lingual Information Extraction](https://arxiv.org/abs/2305.13582)
- Model: [GoLLIE-7B-TF](https://huggingface.co/ychenNLP/GoLLIE-7B-TF)
- Example Jupyter Notebooks: [GoLLIE-TF Notebooks](notebooks/tf.ipynb)
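
## Usage

The model can be loaded with the standard `transformers` causal LM API. Below is a minimal, illustrative sketch; the prompt contents are a placeholder assumption, since the exact GoLLIE-style prompt format (annotation guidelines plus the input text and, for TransFusion, its English translation) is documented in the linked notebooks.

```python
# Minimal sketch of loading and querying GoLLIE-7B-TF with transformers.
# Generation settings and the prompt layout are illustrative only; see the
# example notebooks for the exact prompt format expected by the model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ychenNLP/GoLLIE-7B-TF"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Placeholder prompt: build it following the notebook examples
# (guidelines + low-resource language input + English translation).
prompt = "..."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```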