YuxinJiang committed
Commit 91b5cfc · 1 Parent(s): 1a01ec6

Update README.md

Files changed (1): README.md (+4 -4)
README.md CHANGED
@@ -12,7 +12,10 @@ license: apache-2.0
 [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/deep-continuous-prompt-for-contrastive-1/semantic-textual-similarity-on-sts16)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts16?p=deep-continuous-prompt-for-contrastive-1)
 [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/deep-continuous-prompt-for-contrastive-1/semantic-textual-similarity-on-sts15)](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts15?p=deep-continuous-prompt-for-contrastive-1)
 
-This repository contains the code for our EMNLP 2022 paper [Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning](https://arxiv.org/abs/2203.06875v2). Our code is based on [SimCSE](https://github.com/princeton-nlp/SimCSE) and [P-tuning v2](https://github.com/THUDM/P-tuning-v2/); we sincerely thank the authors for their excellent work.
+arXiv link: https://arxiv.org/abs/2203.06875v2
+To appear in [**EMNLP 2022**](https://2022.emnlp.org/)
+
+Our code is based on [SimCSE](https://github.com/princeton-nlp/SimCSE) and [P-tuning v2](https://github.com/THUDM/P-tuning-v2/); we sincerely thank the authors for their excellent work.
 
 We release our best model checkpoint, which achieves **Top 1** results on four STS tasks:
 
@@ -25,9 +28,6 @@ We release our best model checkpoint, which achieves **Top 1** results on four STS tasks:
 
 If you have any questions, feel free to raise an issue.
 
-[//]: <## Architecture>
-[//]: <We add multi-layer trainable dense vectors as soft prompts to the input sequence, which means that the input embeddings as well as each layer's hidden embeddings of the prompts are optimized (the orange blocks). Note that all parameters of the pre-trained model are frozen (the blue blocks), reducing the number of tunable parameters to around **0.1%**. The [CLS] token embedding of the last layer is selected as the sentence representation. The contrastive framework is the same as SimCSE.>
-
 
 ## Setups
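The architecture note commented out in the diff above describes deep continuous prompt tuning. Below is a minimal PyTorch sketch of that idea, assuming the `past_key_values` mechanism that P-tuning v2 uses to inject per-layer prompts: trainable key/value vectors are prepended to every attention layer of a frozen BERT, and the last-layer [CLS] embedding is taken as the sentence representation. The class name, prompt length, and initialization are illustrative, not the repository's actual code.

```python
import torch
import torch.nn as nn
from transformers import AutoModel


class DeepPromptEncoder(nn.Module):
    """Illustrative sketch: frozen BERT with trainable per-layer prompts."""

    def __init__(self, model_name="bert-base-uncased", prompt_len=16):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        for p in self.bert.parameters():
            p.requires_grad = False  # "the blue blocks": backbone stays frozen

        cfg = self.bert.config
        head_dim = cfg.hidden_size // cfg.num_attention_heads
        self.prompt_len = prompt_len
        # "The orange blocks": one trainable (key, value) prompt per layer.
        # These are the only trainable parameters, a tiny fraction of the backbone.
        self.prompts = nn.Parameter(
            0.02 * torch.randn(
                cfg.num_hidden_layers * 2,  # a key and a value tensor per layer
                cfg.num_attention_heads,
                prompt_len,
                head_dim,
            )
        )

    def forward(self, input_ids, attention_mask):
        bsz = input_ids.size(0)
        # Broadcast prompts over the batch and split into per-layer (key, value) pairs.
        p = self.prompts.unsqueeze(1).expand(-1, bsz, -1, -1, -1)
        past_key_values = tuple(
            (p[2 * i], p[2 * i + 1])
            for i in range(self.bert.config.num_hidden_layers)
        )
        # The attention mask must also cover the prompt positions.
        prompt_mask = attention_mask.new_ones(bsz, self.prompt_len)
        out = self.bert(
            input_ids=input_ids,
            attention_mask=torch.cat([prompt_mask, attention_mask], dim=1),
            past_key_values=past_key_values,
        )
        # Last-layer [CLS] embedding as the sentence representation.
        return out.last_hidden_state[:, 0]
```

Only `self.prompts` receives gradients here, which is how the tunable-parameter count stays in the ~0.1% range the note mentions; the contrastive training loop on top of these embeddings would follow SimCSE.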
 
 
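Since the checkpoint is a Transformers/PyTorch BERT model, it can presumably be loaded for sentence similarity along the following lines. The model id is a placeholder (this page does not state it), and the exact pooling may differ from this sketch.

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "<model-id-of-this-repo>"  # placeholder: substitute the actual Hub id

tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID).eval()

sents = ["A man is playing a guitar.", "Someone is playing an instrument."]
batch = tok(sents, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    # Last-layer [CLS] embeddings as sentence representations (see the
    # architecture sketch above); normalize so the dot product is cosine similarity.
    emb = model(**batch).last_hidden_state[:, 0]
    emb = F.normalize(emb, dim=-1)

print(f"cosine similarity: {(emb[0] @ emb[1]).item():.3f}")
```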