---
license: apache-2.0
datasets:
- xquad
language:
- multilingual
library_name: transformers
tags:
- cross-lingual
- extractive-question-answering
metrics:
- f1
- exact_match
---

Best-performing "mBERT-qa-en, skd" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).

See the official [GitHub repository](https://github.com/ccasimiro88/self-distillation-gxlt-qa) for the code implementing the methods described in the paper.
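
## Usage

Since the card lists `library_name: transformers`, the model should load with the standard extractive question-answering pipeline. Below is a minimal sketch: the model identifier `<this-model-id>` is a placeholder for this repository's actual Hub id, and the question/context pair is only an illustration of the cross-lingual setting (English question over a Spanish context, as in XQuAD).

```python
from transformers import pipeline

# Placeholder id: replace with this repository's actual Hugging Face Hub id.
qa = pipeline("question-answering", model="<this-model-id>")

# Cross-lingual illustration: an English question answered from a Spanish context.
result = qa(
    question="Where is the Eiffel Tower located?",
    context="La Torre Eiffel se encuentra en París, Francia.",
)

# The pipeline returns the extracted answer span, its character offsets,
# and a confidence score, e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': ...}.
print(result["answer"], result["score"])
```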