Update README.md
README.md CHANGED
@@ -3,4 +3,4 @@ license: mit
 ---
 # Distilled-RoBERTa
 
-The DistilBERT model is
+This model is a distilled [RoBERTa](https://huggingface.co/deepset/roberta-base-squad2-distilled) model, trained on the SQuAD 2.0 training set and fine-tuned on the [NewsQA](https://huggingface.co/datasets/lucadiliello/newsqa) dataset.
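
For context, a minimal usage sketch for an extractive question-answering checkpoint like the one this card describes, using the Hugging Face `transformers` pipeline. The model ID below is the linked base model (`deepset/roberta-base-squad2-distilled`), used only as a stand-in because the diff does not show this repo's own model ID; swap in the repo's ID to use the NewsQA fine-tuned checkpoint.

```python
# Minimal sketch: extractive QA with the transformers pipeline.
# NOTE: the model ID is the base model linked in the card, not this
# repo's own checkpoint (which the diff does not name).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="deepset/roberta-base-squad2-distilled",
)

result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="The model was trained on SQuAD 2.0 and fine-tuned on NewsQA.",
)
print(result["answer"], result["score"])
```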