---
license: apache-2.0
datasets:
- xquad
language:
- multilingual
library_name: transformers
tags:
- cross-lingual
- extractive-question-answering
metrics:
- f1
- exact_match
---

Best-performing "mBERT-qa-en, skd, mAP@k" model from the paper _Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation_.

More info at the official GitHub repository: https://github.com/ccasimiro88/self-distillation-gxlt-qa
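
A minimal usage sketch with the 🤗 Transformers question-answering pipeline. The model identifier below is a placeholder, not confirmed by this card; replace it with this repository's Hub ID.

```python
from transformers import pipeline

# Placeholder model ID (assumption): substitute the actual Hub ID of this repository.
model_id = "ccasimiro88/mbert-qa-en-skd-map-k"

# Extractive QA pipeline; the model is cross-lingual, so question and context
# can be in any of the languages covered by XQuAD.
qa = pipeline("question-answering", model=model_id, tokenizer=model_id)

result = qa(
    question="¿Dónde vive el oso polar?",
    context="El oso polar vive principalmente en el Círculo Polar Ártico.",
)
print(result["answer"], result["score"])
```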