- 20 AI, Data Science, Machine Learning Terms You Need to Know in 2020 (Part 2) - Mar 2, 2020.
We explain important AI, ML, Data Science terms you should know in 2020, including Double Descent, Ethics in AI, Explainability (Explainable AI), Full Stack Data Science, Geospatial, GPT-2, NLG (Natural Language Generation), PyTorch, Reinforcement Learning, and Transformer Architecture.
- Illustrating the Reformer - Feb 12, 2020.
In this post, we will try to dive into the Reformer model and try to understand it with some visual guides.
- Top 10 AI, Machine Learning Research Articles to know - Jan 30, 2020.
We’ve seen many predictions for what new advances are expected in the field of AI and machine learning. Here, we review a “data set” based on what researchers were apparently studying at the turn of the decade to take a fresh glimpse into what might come to pass in 2020.
- The Future of Machine Learning - Jan 17, 2020.
This summary reviews the TensorFlow World keynote by Jeff Dean, Head of AI at Google, which considered advancements in computer vision and language models and predicted the direction machine learning model building should take in the future.
- A Comprehensive Guide to Natural Language Generation - Jan 7, 2020.
Follow this overview of Natural Language Generation covering its applications in theory and practice. The evolution of NLG architecture is also described from simple gap-filling to dynamic document creation along with a summary of the most popular NLG models.
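The "simple gap-filling" stage of NLG mentioned in that overview can be sketched in a few lines: a fixed template with named slots is filled from structured data (a minimal illustration; the function and template here are ours, not from the article).

```python
def fill_template(template, slots):
    """Gap-filling NLG: the earliest and simplest architecture.

    A human-written template carries all the fluency; the system
    only substitutes values from structured data into named slots.
    """
    return template.format(**slots)

# Toy weather-report example with made-up slot values.
sentence = fill_template(
    "The temperature in {city} will reach {temp} degrees on {day}.",
    {"city": "Oslo", "temp": 21, "day": "Friday"},
)
```

Later NLG stages described in the article (dynamic document creation, neural models) replace the fixed template with learned text planning and generation.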
- Research Guide for Transformers - Oct 30, 2019.
The problem with RNNs and CNNs is that they struggle to keep track of context and content when sentences grow long. Attention mechanisms address this limitation by letting the model weigh every word in the sentence while processing the word currently being operated on. This guide focuses on how Transformers solve this problem with the help of deep learning.
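The attention idea behind Transformers can be sketched in a few lines of NumPy: every position attends to all other positions at once, weighted by a softmax over dot-product similarity. This is a minimal, illustrative version of the scaled dot-product attention of Vaswani et al. (2017); all names here are ours.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention (Vaswani et al., 2017).

    Q, K, V: arrays of shape (seq_len, d), one row per token.
    Each query row attends to ALL key rows in a single step, so
    distant context is directly reachable, unlike in an RNN.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (seq_len, seq_len) similarities
    # Row-wise softmax turns similarities into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights    # weighted mix of value rows

# Toy self-attention over 3 tokens with 4-dimensional vectors.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

Real Transformers run many such attention "heads" in parallel and stack them with feed-forward layers, but the core computation is just this weighted averaging.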
- BERT, RoBERTa, DistilBERT, XLNet: Which one to use? - Sep 17, 2019.
Lately, various improvements over BERT have been introduced — here I contrast the main similarities and differences so you can choose which one to use in your research or application.
- Deep Learning Next Step: Transformers and Attention Mechanism - Aug 29, 2019.
With the pervasive importance of NLP in so many of today's applications of deep learning, find out how advanced translation techniques can be further enhanced by transformers and attention mechanisms.
- Order Matters: Alibaba’s Transformer-based Recommender System - Aug 23, 2019.
Alibaba, the largest e-commerce platform in China, is a powerhouse not only when it comes to e-commerce, but also when it comes to recommender systems research. Their latest paper, Behaviour Sequence Transformer for E-commerce Recommendation in Alibaba, is yet another publication that pushes the state of the art in recommender systems.
- Adapters: A Compact and Extensible Transfer Learning Method for NLP - Jul 18, 2019.
Adapters obtain comparable results to BERT on several NLP tasks while achieving parameter efficiency.
- Scaling a Massive State-of-the-art Deep Learning Model in Production - Jul 15, 2019.
A new NLP text writing app based on OpenAI's GPT-2 aims to write with you -- whenever you ask. Find out how the developers set up and deployed their model into production, from an engineer working on the team.
- Pre-training, Transformers, and Bi-directionality - Jul 12, 2019.
Bidirectional Encoder Representations from Transformers (BERT; Devlin et al., 2018) is a language representation model that combines the power of pre-training with the bi-directionality of the Transformer’s encoder (Vaswani et al., 2017). BERT improves the state-of-the-art performance on a wide array of downstream NLP tasks with minimal additional task-specific training.
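The bi-directionality discussed above comes from BERT's pre-training objective: some tokens are hidden, and the model must predict them from context on both sides. The sketch below shows only the input-corruption step, in plain Python (simplified: real BERT also keeps or randomizes a fraction of the selected tokens; all names here are illustrative).

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Sketch of BERT's masked-language-model input corruption.

    Randomly replace tokens with [MASK]; during pre-training the
    model predicts the originals using context to the LEFT and
    RIGHT of each mask -- the 'bi-directionality'.
    """
    rng = random.Random(seed)
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            targets.append(tok)   # label the model must recover
        else:
            masked.append(tok)
            targets.append(None)  # no loss at unmasked positions
    return masked, targets

# Example: corrupt a short sentence with a higher mask rate for display.
corrupted, labels = mask_tokens(
    "the cat sat on the mat".split(), mask_prob=0.5, seed=0
)
```

A left-to-right language model like GPT-2 could never use the right-hand context here, which is the key contrast the article draws.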
- Examining the Transformer Architecture – Part 2: A Brief Description of How Transformers Work - Jul 2, 2019.
As the Transformer may become the new NLP standard, this review explores its architecture and compares it to existing RNN-based approaches.
- Examining the Transformer Architecture: The OpenAI GPT-2 Controversy - Jun 20, 2019.
GPT-2 is a generative model, created by OpenAI, trained on 40GB of Internet text to predict the next word. And OpenAI found this model to be SO good that they did not release the fully trained version, citing concerns about malicious applications of the technology.
- Attention Craving RNNS: Building Up To Transformer Networks - Apr 24, 2019.
RNNs let us model sequences in neural networks. While there are other ways of modeling sequences, RNNs are particularly useful. The two most popular RNN flavors are LSTMs (Hochreiter et al., 1997) and GRUs (Cho et al., 2014).
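The sequential bottleneck that motivates attention is visible in a single GRU step: the hidden state must be updated one token at a time. Below is a toy GRU cell in NumPy (a sketch following Cho et al., 2014; the weight names and sizes are ours, not a library API).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One step of a GRU, a gated RNN variant (Cho et al., 2014).

    x: input vector (d_in,); h: previous hidden state (d_h,).
    Gates decide how much of the old state to keep vs. overwrite.
    """
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # blend old and new

# Process a length-4 toy sequence strictly one step at a time.
rng = np.random.default_rng(1)
d_in, d_h = 3, 5
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 3]
h = np.zeros(d_h)
for x in rng.normal(size=(4, d_in)):
    h = gru_cell(x, h, params)
```

Note how step t cannot start until step t-1 finishes — exactly the dependency that Transformer self-attention removes by processing all positions in parallel.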
- 10 Exciting Ideas of 2018 in NLP - Jan 16, 2019.
We outline a selection of exciting developments in NLP from the last year, and include useful recent papers and images to help further assist with your learning.
- BERT: State of the Art NLP Model, Explained - Dec 26, 2018.
BERT’s key technical innovation is applying the bidirectional training of Transformer, a popular attention model, to language modelling. It has caused a stir in the Machine Learning community by presenting state-of-the-art results in a wide variety of NLP tasks.