Scalable attentive sentence-pair modeling via distilled sentence embedding

Oren Barkan, Noam Razin, Itzik Malkiel, Ori Katz, Avi Caciularu, Noam Koenigstein

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

Abstract

Recent state-of-the-art natural language understanding models, such as BERT and XLNet, score a pair of sentences (A and B) using multiple cross-attention operations - a process in which each word in sentence A attends to all words in sentence B and vice versa. As a result, computing the similarity between a query sentence and a set of candidate sentences requires the propagation of all query-candidate sentence-pairs throughout a stack of cross-attention layers. This exhaustive process becomes computationally prohibitive when the number of candidate sentences is large. In contrast, sentence embedding techniques learn a sentence-to-vector mapping and compute the similarity between the sentence vectors via simple elementary operations. In this paper, we introduce Distilled Sentence Embedding (DSE) - a model that is based on knowledge distillation from cross-attentive models, focusing on sentence-pair tasks. The outline of DSE is as follows: Given a cross-attentive teacher model (e.g. a fine-tuned BERT), we train a sentence-embedding-based student model to reconstruct the sentence-pair scores obtained by the teacher model. We empirically demonstrate the effectiveness of DSE on five GLUE sentence-pair tasks. DSE significantly outperforms several ELMO variants and other sentence embedding methods, while accelerating the computation of query-candidate sentence-pair similarities by several orders of magnitude, with an average relative degradation of 4.6% compared to BERT. Furthermore, we show that DSE produces sentence embeddings that reach state-of-the-art performance on universal sentence representation benchmarks. Our code is made publicly available at https://github.com/microsoft/Distilled-Sentence-Embedding.
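
The distillation setup the abstract describes (a sentence-embedding student regressing a cross-attentive teacher's sentence-pair scores) can be sketched roughly as follows. This is a minimal, hypothetical PyTorch illustration, not the released implementation: the encoder choice, the (u, v, |u-v|, u*v) combination layer, and the plain MSE objective are assumptions made for the example, and the class and function names below are invented. See the linked repository for the authors' code.

```python
# A minimal sketch of the DSE training idea, written with PyTorch and
# Hugging Face Transformers. All names and design choices below are
# illustrative assumptions, not the paper's exact implementation.
import torch
import torch.nn as nn
from transformers import AutoModel


class SentenceEmbeddingStudent(nn.Module):
    """Encodes each sentence independently and scores the pair from the two vectors."""

    def __init__(self, model_name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Combine the two sentence vectors and map them to sentence-pair scores.
        self.scorer = nn.Sequential(
            nn.Linear(4 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_labels),
        )

    def embed(self, input_ids, attention_mask):
        # Each sentence is encoded on its own: no cross-attention between A and B.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]  # [CLS] vector as the sentence embedding

    def forward(self, batch_a, batch_b):
        u = self.embed(**batch_a)
        v = self.embed(**batch_b)
        features = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
        return self.scorer(features)


def distillation_step(student, optimizer, batch_a, batch_b, teacher_scores):
    """One training step: regress the cross-attentive teacher's sentence-pair scores."""
    student_scores = student(batch_a, batch_b)
    loss = nn.functional.mse_loss(student_scores, teacher_scores)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At inference time, candidate sentences can be embedded once offline, so scoring a query against many candidates reduces to cheap vector operations instead of running a cross-attention stack per pair, which is the source of the speedup claimed in the abstract.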

Original language: English
Title of host publication: AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
Publisher: AAAI Press
Pages: 3235-3242
Number of pages: 8
ISBN (Electronic): 9781577358350
Publication status: Published - 2020
Externally published: Yes
Event: 34th AAAI Conference on Artificial Intelligence, AAAI 2020 - New York, United States
Duration: 7 Feb 2020 - 12 Feb 2020

Publication series

Name: AAAI 2020 - 34th AAAI Conference on Artificial Intelligence

Conference

Conference: 34th AAAI Conference on Artificial Intelligence, AAAI 2020
Country/Territory: United States
City: New York
Period: 7/02/20 - 12/02/20

Bibliographical note

Publisher Copyright:
© 2020, Association for the Advancement of Artificial Intelligence.
