Abstract
Recent state-of-the-art natural language understanding models, such as BERT and XLNet, score a pair of sentences (A and B) using multiple cross-attention operations - a process in which each word in sentence A attends to all words in sentence B and vice versa. As a result, computing the similarity between a query sentence and a set of candidate sentences requires the propagation of all query-candidate sentence pairs throughout a stack of cross-attention layers. This exhaustive process becomes computationally prohibitive when the number of candidate sentences is large. In contrast, sentence embedding techniques learn a sentence-to-vector mapping and compute the similarity between the sentence vectors via simple elementary operations. In this paper, we introduce Distilled Sentence Embedding (DSE) - a model that is based on knowledge distillation from cross-attentive models, focusing on sentence-pair tasks. The outline of DSE is as follows: Given a cross-attentive teacher model (e.g. a fine-tuned BERT), we train a sentence-embedding-based student model to reconstruct the sentence-pair scores obtained by the teacher model. We empirically demonstrate the effectiveness of DSE on five GLUE sentence-pair tasks. DSE significantly outperforms several ELMO variants and other sentence embedding methods, while accelerating computation of the query-candidate sentence-pair similarities by several orders of magnitude, with an average relative degradation of 4.6% compared to BERT. Furthermore, we show that DSE produces sentence embeddings that reach state-of-the-art performance on universal sentence representation benchmarks. Our code is made publicly available at https://github.com/microsoft/Distilled-Sentence-Embedding.
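The distillation recipe described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions, not the paper's implementation: the mean-pooling student encoder, the (u, v, |u - v|, u * v) pair head, the MSE reconstruction loss, and the model names are choices made for the example.

```python
# Sketch of DSE-style distillation: a sentence-embedding student is trained to
# reconstruct the sentence-pair scores of a cross-attentive teacher (e.g. fine-tuned BERT).
# Pooling, pair head, loss, and model names below are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoModelForSequenceClassification, AutoTokenizer


class SentenceEncoder(nn.Module):
    """Student encoder: maps a single sentence to a fixed-size vector via mean pooling."""

    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        mask = attention_mask.unsqueeze(-1).float()
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


class PairScorer(nn.Module):
    """Scores a sentence pair from the two embeddings using only elementary operations."""

    def __init__(self, hidden_size=768, num_labels=2):
        super().__init__()
        self.classifier = nn.Linear(4 * hidden_size, num_labels)

    def forward(self, u, v):
        return self.classifier(torch.cat([u, v, torch.abs(u - v), u * v], dim=-1))


def distillation_step(sent_a, sent_b, teacher, student, scorer, tokenizer, optimizer):
    """One training step: the student reconstructs the teacher's sentence-pair logits."""
    # Teacher scores the pair with full cross-attention (both sentences encoded jointly).
    with torch.no_grad():
        joint = tokenizer(sent_a, sent_b, padding=True, truncation=True, return_tensors="pt")
        teacher_logits = teacher(**joint).logits
    # Student encodes each sentence independently, then scores the pair from the vectors.
    enc_a = tokenizer(sent_a, padding=True, truncation=True, return_tensors="pt")
    enc_b = tokenizer(sent_b, padding=True, truncation=True, return_tensors="pt")
    u = student(enc_a["input_ids"], enc_a["attention_mask"])
    v = student(enc_b["input_ids"], enc_b["attention_mask"])
    student_logits = scorer(u, v)
    # Reconstruction objective (MSE is an assumption; other distillation losses would also fit).
    loss = nn.functional.mse_loss(student_logits, teacher_logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Example wiring (paths and hyperparameters are placeholders):
# teacher = AutoModelForSequenceClassification.from_pretrained("path/to/finetuned-bert").eval()
# tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# student, scorer = SentenceEncoder(), PairScorer(num_labels=teacher.config.num_labels)
# optimizer = torch.optim.AdamW(list(student.parameters()) + list(scorer.parameters()), lr=2e-5)
```

At inference time only the student is needed: each candidate sentence is embedded once, and a query is compared against all candidates with the cheap pair head, which is what yields the speedup over propagating every query-candidate pair through the cross-attention stack.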
| Original language | English |
|---|---|
| Title of host publication | AAAI 2020 - 34th AAAI Conference on Artificial Intelligence |
| Publisher | AAAI Press |
| Pages | 3235-3242 |
| Number of pages | 8 |
| ISBN (Electronic) | 9781577358350 |
| State | Published - 2020 |
| Externally published | Yes |
| Event | 34th AAAI Conference on Artificial Intelligence, AAAI 2020 - New York, United States; Duration: 7 Feb 2020 → 12 Feb 2020 |
Publication series
| Name | AAAI 2020 - 34th AAAI Conference on Artificial Intelligence |
|---|---|
Conference
| Conference | 34th AAAI Conference on Artificial Intelligence, AAAI 2020 |
|---|---|
| Country/Territory | United States |
| City | New York |
| Period | 7/02/20 → 12/02/20 |
Bibliographical note
Publisher Copyright: © 2020, Association for the Advancement of Artificial Intelligence.