- Topic Modeling with BERT - Nov 3, 2020.
Leveraging BERT and TF-IDF to create easily interpretable topics; a minimal sketch of the approach follows below.
Tags: BERT, NLP, TF-IDF, Topic Modeling
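The approach described above (popularized by tools like BERTopic) typically embeds documents with a BERT-based encoder, clusters the embeddings, and runs TF-IDF over each cluster so the highest-weighted terms label the topic. A minimal sketch, assuming the sentence-transformers and scikit-learn packages and a small sentence-encoder checkpoint chosen here for illustration:

```python
# Minimal sketch: BERT embeddings + clustering + per-cluster TF-IDF topic words.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
import numpy as np

docs = [
    "The stock market rallied after the earnings report.",
    "Investors worry about rising interest rates.",
    "The team won the championship in overtime.",
    "A late goal decided the final match.",
]

# 1. Embed documents with a BERT-based sentence encoder.
encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed example checkpoint
embeddings = encoder.encode(docs)

# 2. Cluster the embeddings; each cluster becomes a candidate topic.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

# 3. Concatenate each cluster's documents and run TF-IDF across clusters,
#    so the top-weighted terms characterize (and distinguish) each topic.
cluster_docs = [" ".join(d for d, l in zip(docs, labels) if l == c) for c in range(2)]
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(cluster_docs).toarray()
terms = np.array(vectorizer.get_feature_names_out())

for c in range(2):
    top = terms[np.argsort(tfidf[c])[::-1][:5]]
    print(f"Topic {c}: {', '.join(top)}")
```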
- Which flavor of BERT should you use for your QA task? - Oct 22, 2020.
Check out this guide to choosing and benchmarking BERT models for question answering; a minimal inference sketch follows below.
Tags: BERT, NLP, Python, Question answering
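For context, extractive QA with a BERT-style model takes only a few lines via Hugging Face's question-answering pipeline; which checkpoint to plug in (the one below is a common SQuAD-tuned DistilBERT, used here only as an assumed example) is exactly what the guide above helps you decide:

```python
# Minimal extractive-QA sketch using the Hugging Face transformers pipeline.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

result = qa(
    question="What does BERT stand for?",
    context="BERT stands for Bidirectional Encoder Representations from Transformers.",
)
print(result["answer"], result["score"])
```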
- Is depth useful for self-attention? - Jul 27, 2020.
Learn about recent research that is the first to explain a surprising phenomenon in BERT/Transformer-like architectures: deepening the network does not seem to work better than widening it (i.e., increasing the representation dimension). This empirical observation contradicts a fundamental premise of deep learning.
Tags: Attention, BERT, Deep Learning, Research, Scalability, Transformer
- Spotting Controversy with NLP - May 21, 2020.
In this article, I’ll introduce you to a hot topic in financial services and describe how a leading data provider is using data science and NLP to streamline how it finds insights in unstructured data.
Tags: BERT, Finance, Fintech, NLP
- Google Unveils TAPAS, a BERT-Based Neural Network for Querying Tables Using Natural Language - May 19, 2020.
The new neural network extends BERT to interact with tabular datasets.
Tags: BERT, Convolutional Neural Networks, Google, NLP
- Why BERT Fails in Commercial Environments - Mar 24, 2020.
The deployment of large transformer-based models in commercial environments often yields poor results because those environments are dynamic, with continuous domain shifts between training and inference data.
Tags: BERT, Business Value, Failure
- Intent Recognition with BERT using Keras and TensorFlow 2 - Feb 10, 2020.
TL;DR: Learn how to fine-tune the BERT model for text classification, then train and evaluate it on a small dataset for detecting seven intents. The results might surprise you! A minimal fine-tuning sketch follows below.
Tags: BERT, Keras, NLP, Python, TensorFlow
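A minimal sketch of the general recipe with TensorFlow 2 / Keras via Hugging Face transformers (not necessarily the exact library the tutorial itself uses; the data below is toy data and versions may differ):

```python
# Minimal sketch: fine-tune BERT for intent classification in Keras / TF2.
import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification

NUM_INTENTS = 7  # seven intent classes, as in the article

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_INTENTS
)

texts = ["play some jazz", "what's the weather tomorrow"]  # toy examples
labels = [0, 1]                                            # toy intent labels

enc = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")
dataset = tf.data.Dataset.from_tensor_slices((dict(enc), labels)).batch(2)

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(dataset, epochs=3)
```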
- Lit BERT: NLP Transfer Learning In 3 Steps - Nov 29, 2019.
PyTorch Lightning is a lightweight framework that allows anyone using PyTorch to scale deep learning code easily while keeping it reproducible. In this tutorial we’ll use Hugging Face's implementation of BERT for a fine-tuning task in Lightning; a minimal sketch follows below.
Tags: BERT, NLP, Python, Transfer Learning
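The three steps boil down to wrapping the Hugging Face model in a LightningModule and handing it to a Trainer. A minimal sketch, assuming current pytorch-lightning and transformers versions (details differ from the 2019 article):

```python
# Minimal sketch: fine-tuning BERT in PyTorch Lightning.
import torch
import pytorch_lightning as pl
from transformers import BertForSequenceClassification

class BertClassifier(pl.LightningModule):
    def __init__(self, num_labels=2, lr=2e-5):
        super().__init__()
        self.model = BertForSequenceClassification.from_pretrained(
            "bert-base-uncased", num_labels=num_labels
        )
        self.lr = lr

    def training_step(self, batch, batch_idx):
        # The Hugging Face model computes the loss itself when labels are passed.
        outputs = self.model(**batch)
        self.log("train_loss", outputs.loss)
        return outputs.loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

# Usage: trainer = pl.Trainer(max_epochs=3); trainer.fit(BertClassifier(), train_loader)
# where train_loader yields dicts with input_ids, attention_mask, and labels.
```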
- Research Guide for Transformers - Oct 30, 2019.
RNNs and CNNs struggle to retain context and content when sentences grow long. Attention mechanisms address this limitation by letting the model weigh every word in the sequence relative to the one currently being processed. This guide focuses on how Transformers use this idea to address the problem.
Tags: BERT, NLP, Research, Transformer, ULMFiT
- BERT, RoBERTa, DistilBERT, XLNet: Which one to use? - Sep 17, 2019.
Lately, several improved variants of BERT have appeared. Here, I contrast their main similarities and differences so you can choose which one to use in your research or application; a short loading-and-comparison sketch follows below.
Tags: BERT, NLP, Transformer
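To make the comparison concrete, all four variants load through the same transformers AutoModel interface, which makes it easy to compare, for example, parameter counts before choosing (checkpoint names below are the standard base-size releases):

```python
# Minimal sketch: load each variant via AutoModel and compare parameter counts.
from transformers import AutoModel

checkpoints = {
    "BERT": "bert-base-uncased",
    "RoBERTa": "roberta-base",
    "DistilBERT": "distilbert-base-uncased",
    "XLNet": "xlnet-base-cased",
}

for name, ckpt in checkpoints.items():
    model = AutoModel.from_pretrained(ckpt)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```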
- BERT is changing the NLP landscape - Sep 9, 2019.
BERT is changing the NLP landscape and making chatbots much smarter by enabling computers to better understand speech and respond intelligently in real-time.
Tags: AI, BERT, Chatbot, NLP
- Adapters: A Compact and Extensible Transfer Learning Method for NLP - Jul 18, 2019.
Adapters obtain results comparable to fully fine-tuned BERT on several NLP tasks while training far fewer parameters; a minimal sketch of the adapter module follows below.
Tags: BERT, NLP, Transfer Learning, Transformer
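An adapter (Houlsby et al., 2019) is a small bottleneck layer inserted into each transformer block; during transfer, only the adapters and the task head are trained while the pre-trained weights stay frozen. A minimal PyTorch sketch of the module itself (sizes are illustrative):

```python
# Minimal sketch of an adapter bottleneck module (Houlsby et al., 2019 style).
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_size=768, bottleneck=64):
        super().__init__()
        # Down-project, nonlinearity, up-project: few parameters vs. a full layer.
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, x):
        # Residual connection keeps the module near-identity at initialization.
        return x + self.up(self.act(self.down(x)))

# During fine-tuning, freeze the backbone so only adapters receive gradients:
# for p in bert.parameters(): p.requires_grad = False
```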
- Pre-training, Transformers, and Bi-directionality - Jul 12, 2019.
Bidirectional Encoder Representations from Transformers (BERT; Devlin et al., 2018) is a language representation model that combines the power of pre-training with the bi-directionality of the Transformer’s encoder (Vaswani et al., 2017). BERT improves state-of-the-art performance on a wide array of downstream NLP tasks with minimal additional task-specific training; a minimal masked-language-modeling sketch follows below.
Tags: AISC, BERT, NLP, Training, Transformer
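Bi-directionality is easiest to see through BERT's masked-language-modeling pre-training objective: the model predicts a masked token from context on both sides. A minimal sketch using the transformers fill-mask pipeline:

```python
# Minimal sketch: BERT uses context on *both* sides of the mask to predict it.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```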
- Examining the Transformer Architecture – Part 2: A Brief Description of How Transformers Work - Jul 2, 2019.
As the Transformer may become the new NLP standard, this review explores its architecture and compares it to existing RNN-based approaches.
Tags: BERT, Deep Learning, Exxact, GPU, NLP, Recurrent Neural Networks, Transfer Learning, Transformer
- XLNet Outperforms BERT on Several NLP Tasks - Jul 1, 2019.
XLNet is a new pretraining method for NLP that achieves state-of-the-art results on several NLP tasks.
Tags: BERT, NLP, Performance
- Deconstructing BERT, Part 2: Visualizing the Inner Workings of Attention - Mar 6, 2019.
In this post, the author shows how BERT can mimic a Bag-of-Words model. The visualization tool from Part 1 is extended to probe deeper into the mind of BERT, exposing the neurons that give BERT its shape-shifting superpowers; a minimal attention-extraction sketch follows below.
Tags: Attention, BERT, NLP, Word Embeddings
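Tools like bertviz render this interactively, but the raw material is simply the per-layer, per-head attention weights that Hugging Face models expose. A minimal sketch of extracting them (the toy sentence is an assumed example):

```python
# Minimal sketch: extract the attention weights that such visualizations use.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: tuple of 12 layers, each (batch, heads, seq_len, seq_len).
attn = outputs.attentions[0][0, 0]  # layer 0, head 0
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for tok, row in zip(tokens, attn):
    print(f"{tok:>8} attends most to {tokens[row.argmax().item()]}")
```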
- Deconstructing BERT: Distilling 6 Patterns from 100 Million Parameters - Feb 27, 2019.
Google’s BERT algorithm has emerged as a sort of “one model to rule them all.” BERT builds on two key ideas that have been responsible for many of the recent advances in NLP: (1) the transformer architecture and (2) unsupervised pre-training.
Tags: Attention, BERT, NLP, Word Embeddings
- KDnuggets™ News 19:n08, Feb 20: The Gold Standard of Python Machine Learning; The Analytics Engineer – new role in the data team - Feb 20, 2019.
Intro to scikit-learn; how to set up a Python ML environment; why there should be a new role in the Data Science team; how to learn one of the hardest parts of being a Data Scientist; and how explainable is BERT?
Tags: BERT, Python, scikit-learn
- Are BERT Features InterBERTible? - Feb 19, 2019.
This is a short analysis of the interpretability of BERT's contextual word representations. Does BERT learn a semantic vector representation like Word2Vec? A small probing sketch follows below.
Tags: BERT, Interpretability, NLP, Word Embeddings
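One simple probe of the Word2Vec question: unlike a static embedding, the same word gets a different BERT vector in each context. A minimal sketch comparing two occurrences of "bank" with cosine similarity (sentences are assumed examples):

```python
# Minimal sketch: the same word gets different BERT vectors in different
# contexts, unlike a static Word2Vec embedding.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

v1 = word_vector("I deposited cash at the bank.", "bank")
v2 = word_vector("We picnicked on the river bank.", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # < 1.0: context-dependent
```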
- Artificial Intelligence and Data Science Advances in 2018 and Trends for 2019 - Feb 18, 2019.
We recap some of the major highlights in data science and AI throughout 2018 before looking at some of the most promising trends and technological advances for the year ahead.
Tags: 2019 Predictions, AI, AutoML, BERT, Data Science, Interpretability, Predictions, Trends
- What were the most significant machine learning/AI advances in 2018? - Jan 22, 2019.
2018 was an exciting year for Machine Learning and AI. We saw “smarter” AI, real-world applications, improvements in underlying algorithms and a greater discussion on the impact of AI on human civilization. In this post, we discuss some of the highlights.
Tags: 2019 Predictions, AI, AlphaZero, BERT, Deep Learning, Machine Learning, NLP, Trends
- 10 Exciting Ideas of 2018 in NLP - Jan 16, 2019.
We outline a selection of exciting developments in NLP from the last year, and include useful recent papers and images to help further assist with your learning.
Tags: BERT, Bias, ICLR, Machine Translation, NLP, Transformer, Unsupervised Learning