---
license: apache-2.0
datasets:
- xquad
language:
- multilingual
library_name: transformers
tags:
- cross-lingual
- extractive-question-answering
metrics:
- f1
- exact_match
---

This is the best-performing "mBERT-qa-en, skd" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).

See the official [GitHub repository](https://github.com/ccasimiro88/self-distillation-gxlt-qa) for the code implementing the methods described in the paper.
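As a minimal usage sketch, the model can be loaded with the standard `transformers` question-answering pipeline. The repo id below is a placeholder (this card does not state it); substitute the id of this model repository. The question and context strings are illustrative only.

```python
from transformers import pipeline

# Placeholder: replace with this model's Hugging Face repo id.
MODEL_ID = "<this-model-repo-id>"

# Build an extractive question-answering pipeline from the fine-tuned mBERT model.
qa = pipeline("question-answering", model=MODEL_ID, tokenizer=MODEL_ID)

# Illustrative cross-lingual example: English question over a Spanish context.
result = qa(
    question="Where is the Eiffel Tower located?",
    context="La Torre Eiffel se encuentra en París, Francia.",
)

# The pipeline returns a dict with the extracted answer span and its score.
print(result["answer"], result["score"])
```

Because the model targets generalized cross-lingual QA (trained on English SQuAD and distilled with mAP@k loss over XQuAD-style languages, per the paper), the question and context may be in different languages.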