Title Extracting sentence embeddings from pretrained transformer models
Authors Stankevičius, Lukas ; Lukoševičius, Mantas
DOI 10.3390/app14198887
Is Part of Applied Sciences. Basel : MDPI, 2024, vol. 14, iss. 19, art. no. 8887, p. 1-66. ISSN 2076-3417
Keywords [eng] BERT ; embeddings ; large language models ; natural language processing ; prompt engineering ; semantic similarity ; sentence vector representation ; text embeddings ; transformer models ; unsupervised learning
Abstract [eng] Pre-trained transformer models shine in many natural language processing tasks and are therefore expected to bear the representation of the input sentence or text meaning. These sentence-level embeddings are also important in retrieval-augmented generation. But do commonly used plain averaging or prompt templates sufficiently capture and represent the underlying meaning? After providing a comprehensive review of existing sentence embedding extraction and refinement methods, we thoroughly test different combinations and our original extensions of the most promising ones on pretrained models. Namely, given 110 M parameters, BERT's hidden representations from multiple layers, and many tokens, we try diverse ways to extract optimal sentence embeddings. We test various token aggregation and representation post-processing techniques. We also test multiple ways of using a general Wikitext dataset to complement BERT's sentence embeddings. All methods are tested on eight Semantic Textual Similarity (STS), six short text clustering, and twelve classification tasks. We also evaluate our representation-shaping techniques on other static models, including random token representations. The proposed representation extraction methods improve performance on STS and clustering tasks for all models considered. The improvements are especially large for static token-based models; on STS tasks, even random embeddings come close to the performance of BERT-derived representations. Our work shows that representation-shaping techniques significantly improve sentence embeddings extracted from BERT-based and simple baseline models.
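The abstract refers to plain averaging of token representations as a common way of obtaining sentence embeddings from BERT. Below is a minimal sketch of that averaging baseline, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint (roughly the 110 M-parameter model mentioned); it is illustrative only and does not reproduce the paper's token-aggregation or post-processing pipelines.

import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative baseline: mean-pool BERT's last-layer token representations
# into one vector per sentence. Model choice and pooling are assumptions.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = ["A cat sits on the mat.", "A feline rests on the rug."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)  # last_hidden_state: (batch, tokens, 768)

# Average only over real tokens, masking out padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
summed = (outputs.last_hidden_state * mask).sum(dim=1)
embeddings = summed / mask.sum(dim=1)  # (batch, 768) sentence embeddings

# Cosine similarity as a stand-in for the STS-style comparison discussed above.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {sim.item():.3f}")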
Published Basel : MDPI
Type Journal article
Language English
Publication date 2024