Sentence-BERT

Learn about Sentence-BERT, its fine-tuning architectures, and multiple ways to compute sentence representation.

Sentence-BERT was introduced by the Ubiquitous Knowledge Processing Lab (UKP-TUDA). As the name suggests, Sentence-BERT is used to obtain fixed-length sentence representations. It extends the pre-trained BERT model (or one of its variants) to compute these representations.
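Sentence-BERT commonly derives a fixed-length sentence vector by pooling BERT's per-token embeddings, for example taking the `[CLS]` token's embedding, the element-wise mean, or the element-wise max. The sketch below illustrates these pooling strategies using random NumPy arrays as stand-ins for real BERT outputs; the array shapes are typical (768-dimensional hidden states), but the values are made up.

```python
import numpy as np

# Illustrative sketch: three common ways to pool per-token embeddings
# into a single fixed-length sentence vector. The token embeddings here
# are random stand-ins for real BERT outputs (shape: tokens x hidden_dim).
rng = np.random.default_rng(0)
token_embeddings = rng.standard_normal((12, 768))  # 12 tokens, 768-dim each

cls_vector = token_embeddings[0]             # embedding of the first ([CLS]) token
mean_vector = token_embeddings.mean(axis=0)  # average over all token embeddings
max_vector = token_embeddings.max(axis=0)    # element-wise max over tokens

# Every strategy yields a vector of the same fixed length,
# no matter how many tokens the sentence contains.
print(cls_vector.shape, mean_vector.shape, max_vector.shape)
```

Because the pooled vector's length depends only on the hidden dimension, sentences of any length map to comparable fixed-size representations.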
