Unsupervised creation of interpretable sentence representations

For sentence similarity/document search applications

Ajit Rajasekharan
Towards Data Science
10 min read · Jul 5, 2020


Figure 1. Unsupervised creation of sentence representation signatures for sentence similarity tasks. The illustration uses the BERT (bert-large-cased) model.

TL;DR

To date, models learn fixed-size representations of sentences, typically with some form of supervision; these representations are then used for sentence similarity or other downstream tasks. Examples of this include Google’s Universal Sentence Encoder
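As a concrete illustration of the baseline approach described above: once each sentence is mapped to a fixed-size vector, sentence similarity reduces to a simple vector comparison such as cosine similarity. A minimal sketch, where the embeddings are placeholder vectors standing in for the output of an encoder (a real encoder such as Universal Sentence Encoder produces 512-dimensional vectors):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two fixed-size sentence vectors,
    # in the range [-1, 1]; higher means more similar.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder "sentence embeddings" (4-dimensional for illustration only).
sent_a = np.array([0.1, 0.9, 0.2, 0.4])  # e.g., "The cat sat on the mat."
sent_b = np.array([0.1, 0.8, 0.3, 0.4])  # e.g., "A cat is on a mat."
sent_c = np.array([0.9, 0.1, 0.7, 0.0])  # e.g., "Stock prices fell today."

print(cosine_similarity(sent_a, sent_b))  # high: sentences are similar
print(cosine_similarity(sent_a, sent_c))  # lower: sentences are unrelated
```

Because every sentence maps to a vector of the same dimensionality, any pair can be compared this way regardless of sentence length, which is what makes fixed-size representations convenient for search and ranking.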
