NLP Word Vectors


Word embeddings such as Word2Vec or fastText are arguably the last major NLP technique before the deep learning revolution. Others would argue that they paved the way for deep learning becoming the NLP standard. Be that as it may, they are a powerful approach, and in this tutorial you will learn about the theory behind these word representations and how they are trained (as well as how to load them from disk). We will also explore approaches to representing text using pretrained vectors.

By the way: embeddings can also be used for non-sequential inputs. Here is a cool project where recipes are used to create food vectors: Food2Vec

We are actually also going to work with food-related data…

Notebook

Training Word Vectors