How to Compute Word Embeddings with Pure Optimization

Word embeddings are one of the most important achievements in Natural Language Processing (NLP) of recent years. Unlike the ever-smaller percentage improvements on some famous benchmark that are usually the highlight of many papers, word embeddings are a conceptual achievement. Today they are so well known that anything more I could say about them would be superfluous. But just to set the pace: word embeddings are vectors of real numbers in a high-dimensional space (anywhere from 50 dimensions up), such that words that are “related” have similar embeddings. [Read More]
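To make “similar embeddings” concrete, here is a minimal sketch comparing toy vectors with cosine similarity. The 4-dimensional values below are invented for illustration (real embeddings are learned and have 50+ dimensions):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Toy 4-dimensional "embeddings"; the numbers are made up for illustration.
embeddings = {
    "king":  np.array([0.9, 0.1, 0.4, 0.8]),
    "queen": np.array([0.8, 0.2, 0.5, 0.9]),
    "apple": np.array([0.1, 0.9, 0.7, 0.0]),
}

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.99: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.32: unrelated words
```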

Make the Most Out of Your MOOC

Have you ever been disappointed with a MOOC? I have been, more than once, even with very high-quality courses. But… why? As someone who has completed more than 50 MOOCs, I have learned that asking questions is the best way to learn. Then again, why the disappointment? I think the key part is the “high-quality” MOOC. My disappointment wasn’t with the quality of the lectures (outstanding), nor with the quality of the courses’ platforms (excellent). [Read More]

Real-Time Feedback from a TensorFlow Generative Model

This article describes research of mine that took off after reading some recent papers on Plug & Play Generative models. Deep generative network models can be powerful: they are trained to take a (small) sequence of data as input and to predict what the next element of the sequence should be. The most famous applications of such models are in Natural Language Processing (NLP): given a sequence of words, generative networks can predict the next word. [Read More]
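As a rough sketch of that sequence-in, next-word-distribution-out interface (not the model from the article; the vocabulary size, layer sizes, and word ids below are arbitrary assumptions):

```python
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 1000  # assumed toy vocabulary size, not from the article
EMBED_DIM = 32

# Minimal next-word predictor: embed each word id, summarize the
# sequence with an LSTM, and output a distribution over the vocabulary.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Untrained weights give a near-uniform distribution; after training on
# a corpus, argmax would be the model's guess for the next word.
sequence = np.array([[12, 7, 85, 3, 42]])  # five arbitrary word ids
probs = model.predict(sequence, verbose=0)
print(probs.shape)          # (1, 1000): one distribution over the vocabulary
print(int(probs.argmax()))  # id of the most likely next word
```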