- Content-Based Recommendation System using Word Embeddings - Aug 14, 2020.
This article explores how average Word2Vec and TF-IDF Word2Vec can be used to build a recommendation engine.
NLP, Recommendation Engine, Recommender Systems, TF-IDF, Word Embeddings, word2vec
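As a rough sketch of the two document representations that article builds on (toy word vectors and TF-IDF scores assumed here, not taken from the article; in practice the vectors would come from a trained Word2Vec model such as gensim's `KeyedVectors`):

```python
import numpy as np

# Hypothetical 2-d embeddings for illustration only.
embeddings = {
    "good": np.array([1.0, 0.0]),
    "book": np.array([0.0, 1.0]),
}

def average_vector(tokens, embeddings):
    """Average Word2Vec: the plain mean of a document's word vectors."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0)

def tfidf_weighted_vector(tokens, embeddings, tfidf):
    """TF-IDF Word2Vec: a weighted mean, each word scaled by its TF-IDF score."""
    vecs, weights = [], []
    for t in tokens:
        if t in embeddings and t in tfidf:
            vecs.append(embeddings[t])
            weights.append(tfidf[t])
    return np.average(vecs, axis=0, weights=weights)

doc = ["good", "book"]
tfidf_scores = {"good": 0.2, "book": 0.8}  # assumed scores for illustration
avg = average_vector(doc, embeddings)                            # [0.5, 0.5]
weighted = tfidf_weighted_vector(doc, embeddings, tfidf_scores)  # [0.2, 0.8]
```

Recommendations then come from ranking documents by cosine similarity between these vectors.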
- A Gentle Introduction to Noise Contrastive Estimation - Jul 25, 2019.
Find out how to use randomness to learn from your data with Noise Contrastive Estimation, in a guide that works through the particulars of its implementation.
Deep Learning, Logistic Regression, Neural Networks, Noise, Random, Sampling, word2vec
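The core idea can be sketched as a logistic regression that separates each observed word from a handful of noise samples; the scores below are toy numbers, and a full implementation would also fold in the log noise-probability correction term:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_loss(score_true, scores_noise):
    """NCE objective for one observed word: binary classification of the
    true (data) sample against k samples drawn from a noise distribution.
    Scores are assumed to already include the noise-probability correction."""
    loss = -np.log(sigmoid(score_true))              # data sample, label 1
    loss += -np.sum(np.log(sigmoid(-scores_noise)))  # noise samples, label 0
    return loss

# Toy scores, e.g. dot products of a context vector with candidate words.
loss = nce_loss(score_true=2.0, scores_noise=np.array([-1.0, -0.5, -2.0]))
```

Because only k noise samples are scored per example, this avoids normalizing over the full vocabulary at every step.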
- Word Embeddings in NLP and its Applications - Feb 20, 2019.
Word embeddings such as Word2Vec are a key AI method that bridges human understanding of language to that of a machine, and they are essential to solving many NLP problems. Here we discuss applications of Word2Vec to survey responses, comment analysis, recommendation engines, and more.
Applications, NLP, Recommender Systems, Word Embeddings, word2vec
- Word Embeddings & Self-Supervised Learning, Explained - Jan 16, 2019.
There are many algorithms to learn word embeddings. Here, we consider only one of them: word2vec, and only one version of word2vec called skip-gram, which works well in practice.
Andriy Burkov, NLP, Word Embeddings, word2vec
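Skip-gram's training data can be sketched in a few lines: each word serves as a center that must predict its neighbors within a window (a minimal illustration, not the article's code):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in skip-gram:
    each word predicts the words within `window` positions of it."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "cat", "sat"], window=1)
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```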
- How to solve 90% of NLP problems: a step-by-step guide - Jan 14, 2019.
Read this insightful, step-by-step article on how to use machine learning to understand and leverage text.
LIME, NLP, Text Analytics, Text Classification, Word Embeddings, word2vec
- Deep Learning for NLP: An Overview of Recent Trends - Sep 5, 2018.
A new paper discusses some of the recent trends in deep learning based natural language processing (NLP) systems and applications. The focus is on the review and comparison of models and methods that have achieved state-of-the-art (SOTA) results on various NLP tasks and some of the current best practices for applying deep learning in NLP.
Deep Learning, NLP, Word Embeddings, word2vec
- Word Vectors in Natural Language Processing: Global Vectors (GloVe) - Aug 29, 2018.
A well-known model that learns vectors for words from their co-occurrence information is Global Vectors (GloVe). While word2vec is a predictive model, a feed-forward neural network that learns vectors to improve its predictive ability, GloVe is a count-based model.
NLP, Sciforce, Text Analytics, word2vec
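The count-based starting point can be sketched as a windowed co-occurrence tally (simplified: actual GloVe additionally weights each co-occurrence by the inverse of the word distance):

```python
from collections import Counter

def cooccurrence_counts(tokens, window=2):
    """Count how often each ordered pair of words appears within
    `window` positions of each other; GloVe fits vectors to these counts."""
    counts = Counter()
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(len(tokens), i + window + 1)):
            counts[(w, tokens[j])] += 1
    return counts

counts = cooccurrence_counts(["the", "cat", "sat", "the"], window=1)
# Counter({('the', 'cat'): 1, ('cat', 'sat'): 1, ('sat', 'the'): 1})
```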
- On the contribution of neural networks and word embeddings in Natural Language Processing - May 31, 2018.
In this post I explain, in a very simplified way, how to apply neural networks and integrate word embeddings in text-based applications, along with some of the main implicit benefits of using them in NLP.
Neural Networks, NLP, Word Embeddings, word2vec
- An Introduction to Deep Learning for Tabular Data - May 17, 2018.
This post will discuss a technique that many people don’t even realize is possible: the use of deep learning for tabular data, and in particular, the creation of embeddings for categorical variables.
Deep Learning, fast.ai, Kaggle, Neural Networks, Rachel Thomas, word2vec
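The core trick can be sketched as follows: each category level maps to a row of a matrix that would be trained jointly with the network, rather than to a one-hot vector (names and the embedding size here are illustrative assumptions, not the article's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# One embedding row per category level; trainable in a real model.
categories = ["mon", "tue", "wed", "thu", "fri", "sat", "sun"]
index = {c: i for i, c in enumerate(categories)}
emb_dim = 4                                        # assumed embedding size
emb = rng.normal(size=(len(categories), emb_dim))  # randomly initialized

def embed(values):
    """Look up the embedding rows for a batch of categorical values."""
    return emb[[index[v] for v in values]]

batch = embed(["mon", "fri"])  # shape (2, 4), fed into the dense layers
```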
- Why Deep Learning is perfect for NLP (Natural Language Processing) - Apr 20, 2018.
Deep learning brings multiple benefits in learning multiple levels of representation of natural language. Here we will cover the motivation of using deep learning and distributed representation for NLP, word embeddings and several methods to perform word embeddings, and applications.
Deep Learning, Neural Networks, NLP, Packt Publishing, word2vec
- Robust Word2Vec Models with Gensim & Applying Word2Vec Features for Machine Learning Tasks - Apr 17, 2018.
The gensim framework, created by Radim Řehůřek, provides a robust, efficient and scalable implementation of the Word2Vec model.
Feature Engineering, NLP, Python, Word Embeddings, word2vec
- Implementing Deep Learning Methods and Feature Engineering for Text Data: The Continuous Bag of Words (CBOW) - Apr 3, 2018.
The CBOW model architecture tries to predict the current target word (the center word) based on the source context words (surrounding words).
Deep Learning, Neural Networks, NLP, word2vec
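The training examples that architecture consumes can be sketched directly: the surrounding words form the input and the center word is the prediction target (a minimal illustration, not the article's code):

```python
def cbow_pairs(tokens, window=1):
    """Generate (context, target) examples as in CBOW: the surrounding
    words jointly predict the center word."""
    examples = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if context:
            examples.append((context, target))
    return examples

examples = cbow_pairs(["the", "quick", "fox"], window=1)
# [(['quick'], 'the'), (['the', 'fox'], 'quick'), (['quick'], 'fox')]
```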
- Training and Visualising Word Vectors - Jan 23, 2018.
In this tutorial I show how you can implement a skip-gram model in TensorFlow to generate word vectors for any text you are working with, and then use TensorBoard to visualize them.
Natural Language Processing, Text Mining, Visualization, word2vec
- Beyond Word2Vec Usage For Only Words - Jan 11, 2018.
A good example of how to use word2vec to get recommendations fast and efficiently.
Machine Learning, Sports, Star Wars, word2vec
- Cartoon: the distance between Espresso and Cappuccino - Apr 22, 2017.
This cartoon takes a vector space approach to your favorite drinks and examines the distance between Espresso and Cappuccino. Warning: this is only funny to Data Scientists and mathematicians.
Cartoon, Coffee, Humor, word2vec
- Deep Learning Reading Group: Skip-Thought Vectors - Nov 17, 2016.
Skip-thought vectors take inspiration from Word2Vec skip-gram and attempt to extend it to sentences, and are created using an encoder-decoder model. Read on for an overview of the paper.
Deep Learning, Lab41, Natural Language Processing, Neural Networks, word2vec
- The Amazing Power of Word Vectors - May 18, 2016.
A fantastic overview of several now-classic papers on word2vec, the work of Mikolov et al. at Google on efficient vector representations of words, and what you can do with them.
Distributed Representation, NLP, word2vec