- A Comprehensive Guide to Ensemble Learning – Exactly What You Need to Know - May 6, 2021.
This article covers ensemble learning methods, and exactly what you need to know in order to understand and implement them.
- Microsoft Explores Three Key Mysteries of Ensemble Learning - Feb 8, 2021.
A new paper studies three key puzzling characteristics of deep learning ensembles and some potential explanations.
- XGBoost: What it is, and when to use it - Dec 23, 2020.
XGBoost is a scalable, tree-based ensemble machine learning system for gradient boosting. Read on for an overview of the parameters that make it work, and for when you would use the algorithm.
- Implementing the AdaBoost Algorithm From Scratch - Dec 10, 2020.
The AdaBoost technique builds an ensemble of decision trees with a depth of one: a forest of stumps rather than full trees. AdaBoost works by putting more weight on instances that are difficult to classify and less on those already handled well, and it can solve both classification and regression problems. Learn to build the algorithm from scratch here.
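The re-weighting loop described above can be sketched from scratch in a few dozen lines. This is a minimal illustration, not the article's own code: decision stumps on 1-D data, exhaustive threshold search, and the standard exponential weight update.

```python
import math

def stump_predict(threshold, polarity, x):
    # A depth-one "stump": classify by comparing x to a threshold.
    return polarity if x >= threshold else -polarity

def fit_adaboost(X, y, n_rounds=5):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform sample weights
    ensemble = []              # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        # Pick the stump with the lowest weighted error.
        best = None
        for t in set(X):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(t, pol, xi) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)                    # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # this stump's vote weight
        ensemble.append((alpha, t, pol))
        # Re-weight: misclassified points get heavier, correct ones lighter.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    # Final prediction is the sign of the weighted vote of all stumps.
    score = sum(a * stump_predict(t, p, x) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

On a toy set such as `X = [1, 2, 3, 4, 5, 6]`, `y = [-1, -1, -1, 1, 1, 1]`, the fitted ensemble recovers the separating threshold.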
- Simple & Intuitive Ensemble Learning in R - Dec 2, 2020.
Read about metaEnsembleR, an R package for heterogeneous ensemble meta-learning (classification and regression) that is fully-automated.
- How I Consistently Improve My Machine Learning Models From 80% to Over 90% Accuracy - Sep 23, 2020.
Data science work typically requires a big lift near the end to increase the accuracy of any model developed. These five recommendations will help improve your machine learning models and help your projects reach their target goals.
- Making sense of ensemble learning techniques - Mar 26, 2020.
This article breaks down ensemble learning and how it can be used for problem solving.
- Random Forest® — A Powerful Ensemble Learning Algorithm - Jan 22, 2020.
The article explains the Random Forest algorithm and how to build and optimize a Random Forest classifier.
- Explaining Black Box Models: Ensemble and Deep Learning Using LIME and SHAP - Jan 21, 2020.
This article will demonstrate explainability on the decisions made by LightGBM and Keras models in classifying a transaction for fraudulence, using two state of the art open source explainability techniques, LIME and SHAP.
- Introducing Generalized Integrated Gradients (GIG): A Practical Method for Explaining Diverse Ensemble Machine Learning Models - Jan 7, 2020.
There is a need for a new way to explain complex, ensembled ML models for high-stakes applications such as credit and lending. This is why we invented GIG.
- 5 Great New Features in Latest Scikit-learn Release - Dec 10, 2019.
From not sweating missing values, to determining feature importance for any estimator, to support for stacking, and a new plotting API, here are 5 new features of the latest release of Scikit-learn which deserve your attention.
- KDnuggets™ News 19:n35, Sep 18: Which Data Science Skills are core and which are hot/emerging ones?; There is No Free Lunch in Data Science Features - Sep 18, 2019.
Check the results of KDnuggets' latest poll to find out which data science skills are core and which are hot/emerging ones; why is there no free lunch in data science?; training Scikit-learn 100x faster; poking fun at unsupervised machine learning; exploring the case for ensemble learning. All this and much more this week on KDnuggets.
- Many Heads Are Better Than One: The Case For Ensemble Learning - Sep 13, 2019.
While ensembling techniques are notoriously hard to set up, operate, and explain, with the latest modeling, explainability and monitoring tools, they can produce more accurate and stable predictions. And better predictions can be better for business.
- Ensemble Methods for Machine Learning: AdaBoost - Sep 12, 2019.
It turns out that if we ask a weak algorithm to create a whole bunch of classifiers (all weak by definition) and then combine them, what emerges can be a stronger classifier.
- 7 Tips for Dealing With Small Data - Jul 29, 2019.
At my workplace, we produce a lot of functional prototypes for our clients. Because of this, I often need to make Small Data go a long way. In this article, I’ll share 7 tips to improve your results when prototyping with small datasets.
- KDnuggets™ News 19:n02, Jan 9: The cold start problem: how to build your machine learning portfolio; 5 Best Data Visualization Libraries - Jan 9, 2019.
Learn how to bootstrap your Machine Learning portfolio, which data visualization libraries to use, main approaches to ensemble learning, how to do text summarization, and check our special offers for leading analytics, AI, and Data Science events below.
- Ensemble Learning: 5 Main Approaches - Jan 3, 2019.
We outline the most popular Ensemble methods including bagging, boosting, stacking, and more.
- GitHub Python Data Science Spotlight: High Level Machine Learning & NLP, Ensembles, Command Line Viz & Docker Made Easy - Oct 16, 2018.
This post spotlights 5 data science projects, all of which are open source and are present on GitHub repositories, focusing on high level machine learning libraries and low level support tools.
- Intuitive Ensemble Learning Guide with Gradient Boosting - Jul 30, 2018.
This tutorial discusses the importance of ensemble learning with gradient boosting as a study case.
- Improving the Performance of a Neural Network - May 30, 2018.
There are many techniques available that could help us achieve that. Follow along to get to know them and to build your own accurate neural network.
- 5 Things to Know About Machine Learning - Mar 7, 2018.
This post will point out 5 things to know about machine learning: 5 things which you may not know, may not have been aware of, or may have once known and now forgotten.
- KDnuggets™ News 18:n07, Feb 14: 5 Machine Learning Projects You Should Not Overlook; Intro to Python Ensembles - Feb 14, 2018.
5 Machine Learning Projects You Should Not Overlook; Introduction to Python Ensembles; Which Machine Learning Algorithm will be used in year 2118?; Fast.ai Lesson 1 on Google Colab (Free GPU)
- Introduction to Python Ensembles - Feb 9, 2018.
In this post, we'll take you through the basics of ensembles — what they are and why they work so well — and provide a hands-on tutorial for building basic ensembles.
- What is the difference between Bagging and Boosting? - Nov 6, 2017.
Bagging and Boosting are both ensemble methods in Machine Learning, but what’s the key behind them? Here we explain in detail.
- Top 10 Machine Learning Algorithms for Beginners - Oct 20, 2017.
A beginner's introduction to the Top 10 Machine Learning (ML) algorithms, complete with figures and examples for easy understanding.
- Random Forests®, Explained - Oct 17, 2017.
Random Forest is one of the most popular and powerful ensemble methods used today in Machine Learning. This post is an introduction to the algorithm and provides a brief overview of its inner workings.
- Understanding Machine Learning Algorithms - Oct 3, 2017.
Machine learning algorithms aren’t difficult to grasp if you understand the basic concepts. Here, a SAS data scientist describes the foundations for some of today’s popular algorithms.
- KDnuggets™ News 17:n37, Sep 27: Essential Data Science & Machine Learning Cheat Sheets; 5 Machine Learning Projects to Check Out Now! - Sep 27, 2017.
30 Essential Data Science, Machine Learning & Deep Learning Cheat Sheets; 5 Machine Learning Projects You Can No Longer Overlook - Episode VI; Putting Machine Learning in Production; 5 Ways to Get Started with Reinforcement Learning; Ensemble Learning to Improve Machine Learning Results
- Ensemble Learning to Improve Machine Learning Results - Sep 22, 2017.
Ensemble methods are meta-algorithms that combine several machine learning techniques into one predictive model in order to decrease variance (bagging), bias (boosting), or improve predictions (stacking).
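The three meta-algorithms named above each map to a standard estimator in scikit-learn. The sketch below assumes scikit-learn is installed and uses a synthetic toy dataset purely for illustration.

```python
# Bagging, boosting, and stacking as scikit-learn meta-estimators
# (assumes scikit-learn is installed; the dataset is synthetic).
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import (BaggingClassifier, AdaBoostClassifier,
                              StackingClassifier)

X, y = make_classification(n_samples=300, random_state=0)

models = {
    # Bagging: average many trees fit on bootstrap samples (variance down).
    "bagging": BaggingClassifier(DecisionTreeClassifier(),
                                 n_estimators=50, random_state=0),
    # Boosting: fit weak learners sequentially on re-weighted data (bias down).
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Stacking: a meta-learner combines the base models' predictions.
    "stacking": StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                    ("lr", LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression(max_iter=1000)),
}

scores = {name: m.fit(X, y).score(X, y) for name, m in models.items()}
```

All three share the same fit/predict interface, so they can be swapped into an existing pipeline with no other changes.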
- Data Science Primer: Basic Concepts for Beginners - Aug 11, 2017.
This collection of concise introductory data science tutorials cover topics including the difference between data mining and statistics, supervised vs. unsupervised learning, and the types of patterns we can mine from data.
- Train your Deep Learning model faster and sharper: Snapshot Ensembling — M models for the cost of 1 - Aug 2, 2017.
We explain a novel Snapshot Ensembling method for increasing accuracy of Deep Learning models while also reducing training time.
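The trick behind snapshot ensembling is a cyclic cosine-annealing learning-rate schedule: the rate repeatedly decays to near zero (where a model snapshot is saved) and then restarts. A minimal sketch of that schedule, following the formula in the Snapshot Ensembles paper:

```python
import math

def snapshot_lr(t, lr0, total_iters, n_cycles):
    # Cyclic cosine annealing: the rate restarts to lr0 at the start of
    # each cycle and decays toward 0, where a snapshot is taken.
    cycle_len = math.ceil(total_iters / n_cycles)
    return lr0 / 2 * (math.cos(math.pi * (t % cycle_len) / cycle_len) + 1)
```

With `total_iters=100` and `n_cycles=5`, the rate falls from `lr0` to near zero over each 20-iteration cycle, then jumps back up, yielding M snapshot models from a single training run.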
- Must-Know: What is the idea behind ensemble learning? - May 2, 2017.
In ensemble methods, the more diverse the models used, the more robust the ultimate result will be.
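The intuition above can be made concrete with a small calculation: if each of n models is independently correct with probability p > 0.5, a majority vote is correct more often than any single model. A brute-force sketch:

```python
from itertools import product

def majority_accuracy(p, n_models):
    # Probability that a majority of n independent models, each correct
    # with probability p, votes for the right label.
    total = 0.0
    for outcome in product([True, False], repeat=n_models):
        if sum(outcome) > n_models / 2:        # majority got it right
            prob = 1.0
            for correct in outcome:
                prob *= p if correct else (1 - p)
            total += prob
    return total
```

For example, three independent models at 70% accuracy vote their way to about 78.4%, and five reach about 83.7% — the gain that diversity (independence) buys.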
- 7 More Steps to Mastering Machine Learning With Python - Mar 1, 2017.
This post is a follow-up to last year's introductory Python machine learning post, which includes a series of tutorials for extending your knowledge beyond the original.
- 17 More Must-Know Data Science Interview Questions and Answers, Part 2 - Feb 22, 2017.
The second part of 17 new must-know Data Science Interview questions and answers covers overfitting, ensemble methods, feature selection, ground truth in unsupervised learning, the curse of dimensionality, and parallel algorithms.
- Stacking Models for Improved Predictions - Feb 21, 2017.
This post presents an example of regression model stacking, and proceeds by using XGBoost, Neural Networks, and Support Vector Regression to predict house prices.
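A sketch of the same stacking pattern, with scikit-learn's GradientBoostingRegressor and SVR standing in for the article's XGBoost and neural-network base models (assumes scikit-learn is installed; the data is synthetic, not the house-price set):

```python
# Regression stacking: base models' predictions feed a meta-learner.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.svm import SVR
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=8, noise=10.0,
                       random_state=0)

stack = StackingRegressor(
    estimators=[("gbr", GradientBoostingRegressor(random_state=0)),
                ("svr", SVR(C=10.0))],
    final_estimator=RidgeCV())   # meta-model blends the base predictions

r2 = cross_val_score(stack, X, y, cv=3).mean()
```

StackingRegressor trains the meta-model on out-of-fold base predictions internally, which guards against the leakage a naive refit-and-stack would introduce.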
- Random Forests® in Python - Dec 2, 2016.
Random forest is a highly versatile machine learning method with numerous applications ranging from marketing to healthcare and insurance. This is a post about random forests using Python.
- Data Science Basics: An Introduction to Ensemble Learners - Nov 8, 2016.
New to classifiers and a bit uncertain of what ensemble learners are, or how different ones work? This post examines 3 of the most popular ensemble methods in an approach designed for newcomers.
- Ensemble Methods: Elegant Techniques to Produce Improved Machine Learning Results - Feb 12, 2016.
Get a handle on ensemble methods from voting and weighting to stacking and boosting, with this well-written overview that includes numerous Python-style pseudocode examples for reinforcement.
- 5 Tribes of Machine Learning – Questions and Answers - Nov 27, 2015.
Leading researcher Pedro Domingos answers questions on 5 tribes of Machine Learning, Master Algorithm, No Free Lunch Theorem, Unsupervised Learning, Ensemble methods, 360-degree recommender, and more.
- Are you trying to acquire Machine Learning Skills? - Sep 16, 2015.
Embarking on a journey through the lands of machine learning? Here are a few important lessons, like Feature Engineering, Model tuning, Overfitting, and Ensembling, which you should keep in mind along the way.
- KDnuggets™ News 15:n21, Jul 1: Top 20 R packages; Using Ensembles in Kaggle; Tutorials and How-Tos - Jul 1, 2015.
Top 20 R packages by popularity; Tutorials, Overviews, How-Tos; Open Source Enabled Interactive Analytics; Using Ensembles in Kaggle Data Science Competitions.
- PAW Chicago: Five Unbeatable Analytics Workshops - May 19, 2015.
Take your predictive analytics up a notch with unbeatable PAW Chicago workshops, covering R, predictive modeling, ensemble methods and more.
- Advanced Data Analytics for Business Leaders Explained - Sep 24, 2014.
A business-level explanation of the most important data analytics and machine learning methods, including neural networks, deep learning, clustering, ensemble methods, SVM, and when to use which models.
- Top KDnuggets tweets, Aug 4-5: Ensemble Methods, a brief history; Data Scientist role shifting - Aug 6, 2014.
Ensemble Methods are the backbone of #MachineLearning - a brief history; Data Scientist role shifting, with companies focusing on Developers; To add #MachineLearning for Python, scikit-learn; for Hadoop: Mahout; Meet Fortune 2014 #BigData All-Stars: data scientists, entrepreneurs, CEOs.
- Top KDnuggets tweets, Apr 9-10: MLlib: Scalable Machine Learning on Spark; Ensemble methods overview - Apr 11, 2014.
MLlib: Scalable Machine Learning on Spark (free ebook); Ensemble methods usually give best results in Machine Learning - an overview; Prediction.io open source machine learning server ; Maslow Hierarchy of Analytical Needs - too clever?