- The Decade of Data Science - Jan 27, 2020.
The last decade was a formative one for the emerging field of Data Science. This review considers current trends in the industry, popular frameworks, helpful tools, and newer tools that can be leveraged more in the future.
- Fighting Overfitting in Deep Learning - Dec 27, 2019.
This post outlines an attack plan for fighting overfitting in neural networks.
- Lit BERT: NLP Transfer Learning In 3 Steps - Nov 29, 2019.
PyTorch Lightning is a lightweight framework that allows anyone using PyTorch to scale deep learning code easily while keeping it reproducible. In this tutorial, we'll use Hugging Face's implementation of BERT to perform a fine-tuning task in Lightning.
- Transfer Learning Made Easy: Coding a Powerful Technique - Nov 13, 2019.
While the deep learning revolution now impacts our daily lives, these networks are expensive to train. Transfer learning promises to ease this burden by enabling the reuse of trained models -- and this hands-on tutorial will walk you through a transfer learning technique you can run on your laptop.
- The State of Transfer Learning in NLP - Sep 13, 2019.
This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP organized by Matthew Peters, Swabha Swayamdipta, Thomas Wolf, and Sebastian Ruder. This post highlights key insights and takeaways and provides updates based on recent work.
- 7 Tips for Dealing With Small Data - Jul 29, 2019.
At my workplace, we produce a lot of functional prototypes for our clients. Because of this, I often need to make Small Data go a long way. In this article, I’ll share 7 tips to improve your results when prototyping with small datasets.
- KDnuggets™ News 19:n27, Jul 24: Bayesian deep learning and near-term quantum computers; DeepMind’s CASP13 Protein Folding Upset Summary - Jul 24, 2019.
This week on KDnuggets: Learn how DeepMind dominated the last CASP competition for advancing protein folding models; Bayesian deep learning and near-term quantum computers: A cautionary tale in quantum machine learning; The Evolution of a ggplot; Adapters: A Compact and Extensible Transfer Learning Method for NLP; 12 Things I Learned During My First Year as a Machine Learning Engineer; Things I Learned From the SciPy 2019 Lightning Talks; and much more!
- Adapters: A Compact and Extensible Transfer Learning Method for NLP - Jul 18, 2019.
Adapters obtain comparable results to BERT on several NLP tasks while achieving parameter efficiency.
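The adapter idea can be sketched in a few lines: insert a small bottleneck module (down-projection, nonlinearity, up-projection, residual connection) into each transformer layer, and train only those modules while the pretrained weights stay frozen. The following is a minimal, hypothetical sketch in PyTorch; the class name and `bottleneck_dim` value are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: project down, apply a nonlinearity,
    project back up, then add a residual connection."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

# Only adapter (and task-head) parameters are trained; the pretrained
# transformer weights stay frozen, so per-task storage stays tiny.
hidden = 768  # BERT-base hidden size
adapter = Adapter(hidden)
trainable = sum(p.numel() for p in adapter.parameters())
# Roughly 2 * hidden * bottleneck parameters -- far fewer than
# the ~110M parameters of full BERT-base fine-tuning.
```

The bottleneck width is the knob trading parameter count against task accuracy.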
- Examining the Transformer Architecture – Part 2: A Brief Description of How Transformers Work - Jul 2, 2019.
As the Transformer may become the new NLP standard, this review explores its architecture along with a comparison to existing RNN-based approaches.
- 5 Ways to Deal with the Lack of Data in Machine Learning - Jun 10, 2019.
Effective solutions exist when you don't have enough data for your models. While no approach is perfect, these five proven methods can get your model to production.
- 7 Steps to Mastering Intermediate Machine Learning with Python — 2019 Edition - Jun 3, 2019.
This is the second part of this new learning path series for mastering machine learning with Python. Check out these 7 steps to help master intermediate machine learning with Python!
- KDnuggets™ News 19:n11, Mar 20: Another 10 Free Must-Read Books for Data Science; 19 Inspiring Women in AI, Big Data, Machine Learning - Mar 20, 2019.
Also: Who is a typical Data Scientist in 2019?; The Pareto Principle for Data Scientists; My favorite mind-blowing Machine Learning/AI breakthroughs; Building NLP Classifiers Cheaply With Transfer Learning and Weak Supervision; Advanced Keras - Accurately Resuming a Training Process
- Building NLP Classifiers Cheaply With Transfer Learning and Weak Supervision - Mar 15, 2019.
In this blog, I’ll walk you through a personal project in which I cheaply built a classifier to detect anti-semitic tweets, with no public dataset available, by combining weak supervision and transfer learning.
- Acquiring Labeled Data to Train Your Models at Low Costs - Feb 27, 2019.
We discuss groundbreaking and unique methods to acquire labeled data at low cost, including 3rd-Party Plug-and-Play AI Model, Zero-Shot Learning, and Restructuring the Existing Data Set.
- State of the art in AI and Machine Learning – highlights of papers with code - Feb 20, 2019.
We introduce Papers With Code, a free and open resource of state-of-the-art machine learning papers, code, and evaluation tables.
- KDnuggets™ News 18:n37, Oct 3: Mathematics of Machine Learning; Effective Transfer Learning for NLP; Path Analysis with R - Oct 3, 2018.
Also: Introducing VisualData: A Search Engine for Computer Vision Datasets; Raspberry Pi IoT Projects for Fun and Profit; Recent Advances for a Better Understanding of Deep Learning; Basic Image Data Analysis Using Python - Part 3; Introduction to Deep Learning
- More Effective Transfer Learning for NLP - Oct 1, 2018.
Until recently, the natural language processing community lacked its ImageNet equivalent: a standardized dataset and training objective to use for training base models.
- Building an Audio Classifier using Deep Neural Networks - Dec 15, 2017.
How to use a deep convolutional neural network architecture to classify audio, and how to effectively apply transfer learning and data augmentation to improve model accuracy on small datasets.
- The 10 Deep Learning Methods AI Practitioners Need to Apply - Dec 13, 2017.
Deep learning emerged from the past decade's explosive computational growth as a serious contender in the field, winning many important machine learning competitions. The interest has not cooled as of 2017; today, we see deep learning mentioned in every corner of machine learning.
- MLDB: The Machine Learning Database - Oct 17, 2016.
MLDB is an open-source database designed for machine learning. Send it commands over a RESTful API to store data, explore it using SQL, then train machine learning models and expose them as APIs.
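Driving a database like this means issuing HTTP requests against its REST endpoints. The sketch below builds (but does not send) such a request using only the Python standard library; the host, port, and `/v1/datasets` path and `sparse.mutable` payload are assumptions based on MLDB's documented API style, so check the actual docs before relying on them.

```python
import json
from urllib.request import Request

# Hypothetical host/port; MLDB listens on a configurable HTTP port.
BASE = "http://localhost:8080"

def mldb_put(path: str, payload: dict) -> Request:
    """Build a PUT request for a RESTful API like MLDB's (not sent here)."""
    return Request(
        BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

# e.g. create a mutable dataset to hold training rows
req = mldb_put("/v1/datasets/demo", {"type": "sparse.mutable"})
```

In a live session you would pass `req` to `urllib.request.urlopen` (or use a client like `requests`) and then query the stored data with SQL over the same API.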
- Recycling Deep Learning Models with Transfer Learning - Aug 14, 2015.
Deep learning exploits gigantic datasets to produce powerful models. But what can we do when our datasets are comparatively small? Transfer learning by fine-tuning deep nets offers a way to leverage existing datasets to perform well on new tasks.
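The fine-tuning recipe described above -- freeze a pretrained network's feature layers, attach a fresh head, and train only the head on the small target dataset -- can be sketched in PyTorch. Here the "pretrained" backbone is a stand-in toy network (in practice it would come from a model zoo with learned weights), so the shapes and layer choices are purely illustrative.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained network: in practice this backbone would be
# loaded with weights learned on a large source dataset.
backbone = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 16))
for p in backbone.parameters():
    p.requires_grad = False   # freeze: reuse the learned representations

head = nn.Linear(16, 3)       # fresh head for the small target task

model = nn.Sequential(backbone, head)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)  # train head only

# One illustrative training step on random stand-in data.
x, y = torch.randn(8, 32), torch.randint(0, 3, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()               # gradients flow only into the head
optimizer.step()
```

A common refinement is to later unfreeze the top backbone layers and continue with a much smaller learning rate, which is the "fine-tuning deep nets" variant the post discusses.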