SGPT: GPT Sentence Embeddings for Semantic Search
Abstract
SGPT, a 5.8-billion-parameter decoder transformer, produces sentence embeddings and semantic search results that surpass those of far larger models.
Decoder transformers have continued to grow in scale, reaching hundreds of billions of parameters. Thanks to this scale, a single decoder sets state-of-the-art results on various language tasks via prompting or fine-tuning. Yet these large foundation models remain unusable for the related fields of semantic search and sentence embeddings. This blocks potential new state-of-the-art results and forces organizations to train and maintain separate models. To this end, we propose SGPT, which uses decoders for sentence embeddings and semantic search via prompting or fine-tuning. At 5.8 billion parameters, SGPT improves on the previously best sentence embeddings by a margin of 7% and outperforms a concurrent method with 175 billion parameters, as measured on the BEIR search benchmark. Code, models, and result files are freely available at https://github.com/Muennighoff/sgpt.
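As background for how a decoder can serve semantic search, the sketch below illustrates the general recipe of pooling a decoder's token-level hidden states into a single sentence vector and ranking documents by cosine similarity. It uses masked mean pooling over random stand-in vectors; the pooling scheme, dimensions, and function names here are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def mean_pool(hidden_states: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling: average token vectors, ignoring padding.

    hidden_states: (seq_len, dim) token-level outputs of a decoder.
    mask: (seq_len,) with 1 for real tokens, 0 for padding.
    """
    w = mask[:, None].astype(float)
    return (hidden_states * w).sum(axis=0) / w.sum()

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity used to rank documents against a query."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in for real decoder outputs (seq_len=6, hidden dim=8).
rng = np.random.default_rng(0)
query_states = rng.normal(size=(6, 8))
mask = np.array([1, 1, 1, 1, 0, 0])  # last two positions are padding

query_emb = mean_pool(query_states, mask)
assert query_emb.shape == (8,)
```

In a real pipeline, `hidden_states` would come from the decoder's final layer for each input, and every corpus document would be embedded once offline so that search reduces to a nearest-neighbor lookup over the pooled vectors.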