- Content-Based Recommendation System using Word Embeddings - Aug 14, 2020.
This article explores how average Word2Vec and TF-IDF Word2Vec can be used to build a recommendation engine.
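A minimal sketch of the averaging idea behind that approach: represent each document by the (optionally TF-IDF-weighted) average of its word vectors, then recommend by cosine similarity. The tiny 2-d embeddings and document names below are hypothetical stand-ins for vectors from a trained Word2Vec model.

```python
import numpy as np

# Toy word vectors (hypothetical; in practice these come from a trained Word2Vec model)
embeddings = {
    "space":  np.array([0.9, 0.1]),
    "rocket": np.array([0.8, 0.2]),
    "launch": np.array([0.7, 0.3]),
    "pasta":  np.array([0.1, 0.9]),
    "recipe": np.array([0.2, 0.8]),
}

def avg_doc_vector(tokens, weights=None):
    """Average the word vectors of a document; `weights` (e.g. TF-IDF scores)
    turns the plain average into the TF-IDF-weighted variant."""
    vecs, w = [], []
    for t in tokens:
        if t in embeddings:
            vecs.append(embeddings[t])
            w.append(1.0 if weights is None else weights.get(t, 0.0))
    return np.average(vecs, axis=0, weights=w)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

docs = {
    "article_a": ["space", "rocket", "launch"],
    "article_b": ["pasta", "recipe"],
}
query = avg_doc_vector(["rocket", "space"])
ranked = sorted(docs, key=lambda d: cosine(query, avg_doc_vector(docs[d])),
                reverse=True)
print(ranked[0])  # the space-themed article is the closest match
```

Swapping the uniform weights for per-word TF-IDF scores is the only change needed to move from average Word2Vec to TF-IDF Word2Vec.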
- A Gentle Introduction to Noise Contrastive Estimation - Jul 25, 2019.
Find out how to use randomness to learn your data by using Noise Contrastive Estimation with this guide that works through the particulars of its implementation.
- Extracting Knowledge from Knowledge Graphs Using Facebook’s PyTorch-BigGraph - May 22, 2019.
We use state-of-the-art deep learning tools to build a model that predicts a word using the surrounding words as labels.
- Word Embeddings in NLP and its Applications - Feb 20, 2019.
Word embeddings such as Word2Vec are a key AI method that bridges human understanding of language to that of a machine, and they are essential to solving many NLP problems. Here we discuss applications of Word2Vec to survey responses, comment analysis, recommendation engines, and more.
- Word Embeddings & Self-Supervised Learning, Explained - Jan 16, 2019.
There are many algorithms for learning word embeddings. Here we consider only one of them, word2vec, and only one version of word2vec, called skip-gram, which works well in practice.
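The core of skip-gram is its training data: each word is used to predict the words within a fixed window around it. A minimal sketch of how those (center, context) pairs are generated:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in the skip-gram
    objective: each word predicts the words within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:  # a word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

A neural network then learns an embedding for each center word that makes its observed context words likely.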
- How to solve 90% of NLP problems: a step-by-step guide - Jan 14, 2019.
Read this insightful, step-by-step article on how to use machine learning to understand and leverage text.
- Data Representation for Natural Language Processing Tasks - Nov 2, 2018.
In NLP we must find a way to represent our data (a series of texts) to our systems (e.g. a text classifier). As Yoav Goldberg asks, "How can we encode such categorical data in a way which is amenable for use by a statistical classifier?" Enter the word vector.
- Deep Learning for NLP: An Overview of Recent Trends - Sep 5, 2018.
A new paper discusses some of the recent trends in deep learning based natural language processing (NLP) systems and applications. The focus is on the review and comparison of models and methods that have achieved state-of-the-art (SOTA) results on various NLP tasks and some of the current best practices for applying deep learning in NLP.
- Word Vectors in Natural Language Processing: Global Vectors (GloVe) - Aug 29, 2018.
A well-known model that learns vectors for words from their co-occurrence information is Global Vectors (GloVe). While word2vec is a predictive model (a feed-forward neural network that learns vectors to improve its predictive ability), GloVe is a count-based model.
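The co-occurrence information a count-based model starts from can be sketched in a few lines. This toy version counts symmetric within-window co-occurrences; the real GloVe additionally weights context words by the inverse of their distance and then factorizes the resulting matrix.

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Symmetric co-occurrence counts, the statistic a count-based model
    like GloVe is built on (distance weighting omitted for brevity)."""
    counts = Counter()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(len(tokens), i + window + 1)):
            counts[(w, tokens[j])] += 1  # count both directions so the
            counts[(tokens[j], w)] += 1  # matrix is symmetric
    return counts

c = cooccurrence_counts("the cat sat on the mat".split(), window=2)
print(c[("the", "cat")])  # 1
```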
- On the contribution of neural networks and word embeddings in Natural Language Processing - May 31, 2018.
In this post I will try to explain, in a very simplified way, how to apply neural networks and integrate word embeddings into text-based applications, and some of the main implicit benefits of doing so in NLP.
- An Introduction to Deep Learning for Tabular Data - May 17, 2018.
This post will discuss a technique that many people don’t even realize is possible: the use of deep learning for tabular data, and in particular, the creation of embeddings for categorical variables.
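The idea of categorical embeddings can be sketched without any deep learning framework: each level of a categorical column maps to a row of a (trainable) dense matrix instead of a sparse one-hot column. The column name, levels, and dimensions below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical categorical column "day of week": each level gets a dense
# vector (here 3-d, chosen arbitrarily) that would be trained jointly with
# the rest of the network instead of a fixed one-hot encoding.
levels = ["mon", "tue", "wed", "thu", "fri"]
level_to_idx = {lvl: i for i, lvl in enumerate(levels)}
embedding_table = rng.normal(size=(len(levels), 3))  # 5 categories -> 3-d

def embed(column):
    """Replace a list of category labels with their embedding rows."""
    return embedding_table[[level_to_idx[v] for v in column]]

batch = embed(["mon", "fri", "mon"])
print(batch.shape)  # (3, 3)
```

In a framework like PyTorch or fastai this lookup table is the embedding layer, and its rows are updated by backpropagation like any other weights.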
- Why Deep Learning is perfect for NLP (Natural Language Processing) - Apr 20, 2018.
Deep learning brings multiple benefits in learning multiple levels of representation of natural language. Here we will cover the motivation of using deep learning and distributed representation for NLP, word embeddings and several methods to perform word embeddings, and applications.
- Robust Word2Vec Models with Gensim & Applying Word2Vec Features for Machine Learning Tasks - Apr 17, 2018.
The gensim framework, created by Radim Řehůřek, provides a robust, efficient and scalable implementation of the Word2Vec model.
- Implementing Deep Learning Methods and Feature Engineering for Text Data: The Continuous Bag of Words (CBOW) - Apr 3, 2018.
The CBOW model architecture tries to predict the current target word (the center word) based on the source context words (surrounding words).
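CBOW is the mirror image of skip-gram: instead of one center word predicting each context word, the surrounding words jointly predict the center word. A minimal sketch of how its (context, target) training pairs are built:

```python
def cbow_pairs(tokens, window=2):
    """(context, target) pairs for CBOW: the words within `window`
    positions jointly predict the center word."""
    pairs = []
    for i, target in enumerate(tokens):
        ctx = tokens[max(0, i - window):i] + tokens[i + 1:i + window + 1]
        if ctx:  # skip degenerate single-word inputs with no context
            pairs.append((ctx, target))
    return pairs

print(cbow_pairs(["the", "cat", "sat"], window=1))
# [(['cat'], 'the'), (['the', 'sat'], 'cat'), (['cat'], 'sat')]
```

In the full model, the context word vectors are averaged and fed through a softmax layer to score the target word.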
- Training and Visualising Word Vectors - Jan 23, 2018.
In this tutorial I want to show how you can implement a skip-gram model in TensorFlow to generate word vectors for any text you are working with, and then use TensorBoard to visualize them.
- Beyond Word2Vec Usage For Only Words - Jan 11, 2018.
A good example of how to use word2vec to get recommendations fast and efficiently.
- How to win Kaggle competition based on NLP task, if you are not an NLP expert - Sep 29, 2017.
Here is how we got one of the best results in a Kaggle challenge remarkable for a number of interesting findings and controversies among the participants.
- Cartoon: the distance between Espresso and Cappuccino - Apr 22, 2017.
This cartoon takes a vector space approach to your favorite drinks and examines the distance between Espresso and Cappuccino. Warning: this is only funny to Data Scientists and mathematicians.
- Deep Learning Reading Group: Skip-Thought Vectors - Nov 17, 2016.
Skip-thought vectors take inspiration from Word2Vec skip-gram and attempt to extend it to sentences, and are created using an encoder-decoder model. Read on for an overview of the paper.
- The Amazing Power of Word Vectors - May 18, 2016.
A fantastic overview of several now-classic papers on word2vec, the work of Mikolov et al. at Google on efficient vector representations of words, and what you can do with them.
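One thing you can do with them is the classic analogy arithmetic (king - man + woman ≈ queen). The 2-d vectors below are contrived purely for illustration; real Word2Vec vectors are learned and typically have hundreds of dimensions.

```python
import numpy as np

# Contrived toy vectors chosen so the gender offset is consistent:
# king - man == queen - woman == [0, 1]
v = {
    "king":  np.array([1.0, 1.0]),
    "man":   np.array([1.0, 0.0]),
    "woman": np.array([0.0, 0.0]),
    "queen": np.array([0.0, 1.0]),
}

query = v["king"] - v["man"] + v["woman"]  # -> [0.0, 1.0]

def nearest(q, exclude):
    """Nearest vocabulary word to q, excluding the analogy's input words."""
    return min((w for w in v if w not in exclude),
               key=lambda w: np.linalg.norm(v[w] - q))

print(nearest(query, {"king", "man", "woman"}))  # queen
```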
- Top KDnuggets tweets, Apr 14-20: Modern Methods for Sentiment Analysis; Basics of SQL, RDBMS – must have skills - Apr 21, 2015.
Great overview: Modern Methods for Sentiment Analysis #word2vec; Basics of SQL and RDBMS - must have skills for data science; The 7 Most Unusual Applications of Big Data; Extensive, but a little confusing site: Understanding Data Visualization.
- Math of Ideas: A Word is Worth a Thousand Vectors - Apr 16, 2015.
Word vectors give us a simple and flexible platform for understanding text. Here are a few diverse examples that should help build your confidence in developing and deploying NLP systems, and show what problems they can solve.