Word Embeddings
"Word embedding is the way of representing words as vectors. The main goal of word embedding is to redefine those high dimensional word features into low dimensional feature vectors as you preserve that contextual similarity present in the text corpus. Word embedding, or text vectors are commonly used in deep learning models such as recurrent neural networks and convolutional neural networks."