- Machine Learning – it’s all about assumptions - Feb 11, 2021.
As with most things in life, assumptions can directly lead to success or failure. In machine learning, appreciating the assumptions behind each technique will guide you toward applying the best tool for your data.
Algorithms, Decision Trees, K-nearest neighbors, Linear Regression, Logistic Regression, Machine Learning, Naive Bayes, SVM, XGBoost
- All Machine Learning Algorithms You Should Know in 2021 - Jan 4, 2021.
Many machine learning algorithms exist, ranging from simple to complex in their approach, and together they provide a powerful library of tools for analyzing and predicting patterns in data. Whether you are learning for the first time or reviewing techniques, these intuitive explanations of the most popular machine learning models will help you kick off the new year with confidence.
Algorithms, Decision Trees, Explained, Gradient Boosting, K-nearest neighbors, Machine Learning, Naive Bayes, Regression, SVM
- Most Popular Distance Metrics Used in KNN and When to Use Them - Nov 11, 2020.
To calculate distances, KNN uses a distance metric chosen from a list of available options. Read this article for an overview of these metrics and when each should be considered for use.
K-nearest neighbors, Metrics, scikit-learn
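As a quick, hedged illustration of the idea (not code from the article), scikit-learn's KNeighborsClassifier lets you swap the distance metric via its metric parameter; the wine dataset, k=5, and the particular metrics compared below are illustrative assumptions.

```python
# Minimal sketch: comparing a few distance metrics for a KNN classifier.
# Dataset, k, and the metric list are illustrative, not from the article.
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)

for metric in ["euclidean", "manhattan", "chebyshev", "minkowski"]:
    knn = KNeighborsClassifier(n_neighbors=5, metric=metric)
    scores = cross_val_score(knn, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"{metric:>10}: mean CV accuracy = {scores.mean():.3f}")
```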
- Doing the impossible? Machine learning with less than one example - Nov 9, 2020.
Machine learning algorithms are notorious for needing data, a lot of data -- the more data the better. But much research has gone into developing new methods that need fewer examples to train a model, such as "few-shot" or "one-shot" learning, which require only a handful or even a single example for effective learning. Now, this lower boundary on training examples is being pushed to the next extreme.
Algorithms, K-nearest neighbors, Machine Learning, Research
- How to Explain Key Machine Learning Algorithms at an Interview - Oct 19, 2020.
When preparing for data science interviews, it is essential to clearly understand a range of machine learning models and to have a concise explanation for each at the ready. Here, we summarize various machine learning models by highlighting their main points to help you communicate complex models effectively.
Algorithms, Decision Trees, Interview Questions, K-nearest neighbors, Machine Learning, Naive Bayes, Regression, SVM
- Exploring The Brute Force K-Nearest Neighbors Algorithm - Oct 12, 2020.
This article discusses a simple approach to increasing the accuracy of k-nearest neighbors models in a particular subset of cases.
Algorithms, K-nearest neighbors, Machine Learning, Python
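As a hedged sketch of the setting (an assumption, not the article's own code), scikit-learn exposes the neighbor-search strategy through the algorithm parameter; passing "brute" computes distances to every training sample instead of building a tree index. The synthetic data below is purely illustrative.

```python
# Hedged sketch: forcing brute-force neighbor search in scikit-learn.
# The synthetic data and parameter values are illustrative only.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))
y_train = (X_train[:, 0] > 0).astype(int)  # toy labels from the first feature

knn = KNeighborsClassifier(n_neighbors=3, algorithm="brute")
knn.fit(X_train, y_train)
print(knn.predict(rng.normal(size=(5, 10))))
```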
- Introduction to the K-nearest Neighbour Algorithm Using Examples - Apr 1, 2020.
Read this concise summary of KNN, a supervised pattern-classification algorithm that determines which class a new input belongs to by choosing its k nearest neighbours and calculating the distance to each of them.
Algorithms, K-nearest neighbors, Machine Learning, Python, scikit-learn
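For readers who want to see the idea in code, here is a minimal sketch of KNN classification with scikit-learn; the dataset, train/test split, and k=3 are illustrative assumptions rather than details from the article.

```python
# Minimal sketch of KNN classification with scikit-learn; the dataset,
# split, and k=3 are illustrative choices, not from the article.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

knn = KNeighborsClassifier(n_neighbors=3)  # k nearest neighbours to consult
knn.fit(X_train, y_train)                  # "training" simply stores the data
y_pred = knn.predict(X_test)               # distances + majority vote at query time
print("Accuracy:", accuracy_score(y_test, y_pred))
```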
- Beginner’s Guide to K-Nearest Neighbors in R: from Zero to Hero - Jan 3, 2020.
This post presents a pipeline for building a KNN model in R, evaluated with various measurement metrics.
Beginners, K-nearest neighbors, Metrics, R
- 5 Great New Features in Latest Scikit-learn Release - Dec 10, 2019.
From not sweating missing values, to determining feature importance for any estimator, to support for stacking, and a new plotting API, here are 5 new features of the latest release of Scikit-learn which deserve your attention.
Data Preparation, Data Preprocessing, Ensemble Methods, Feature Selection, Gradient Boosting, K-nearest neighbors, Machine Learning, Missing Values, Python, scikit-learn, Visualization
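One of those features, KNN-based imputation of missing values, is worth a quick hedged sketch; the toy array and n_neighbors=2 below are illustrative, and this assumes the KNNImputer introduced in scikit-learn 0.22.

```python
# Hedged sketch of KNN-based missing-value imputation (scikit-learn >= 0.22).
# The toy array below is illustrative.
import numpy as np
from sklearn.impute import KNNImputer

X = np.array([
    [1.0, 2.0, np.nan],
    [3.0, 4.0, 3.0],
    [np.nan, 6.0, 5.0],
    [8.0, 8.0, 7.0],
])

# Each missing value is filled with the mean of that feature over the
# n_neighbors rows closest to it (distances ignore missing entries).
imputer = KNNImputer(n_neighbors=2)
print(imputer.fit_transform(X))
```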
- Classifying Heart Disease Using K-Nearest Neighbors - Jul 8, 2019.
I have written this post for developers, and it assumes no background in statistics or mathematics. The focus is mainly on how the k-NN algorithm works and how to use it for predictive modeling problems.
Healthcare, K-nearest neighbors, Machine Learning, Medical, Python
- 7 Steps to Mastering Intermediate Machine Learning with Python — 2019 Edition - Jun 3, 2019.
This is the second part of this new learning path series for mastering machine learning with Python. Check out these 7 steps to help master intermediate machine learning with Python!
7 Steps, Classification, Cross-validation, Dimensionality Reduction, Feature Engineering, Feature Selection, Image Classification, K-nearest neighbors, Machine Learning, Modeling, Naive Bayes, numpy, Pandas, PCA, Python, scikit-learn, Transfer Learning

- Journey to Machine Learning – 100 Days of ML Code - Sep 7, 2018.
A personal account from Machine Learning enthusiast Avik Jain on his experiences of #100DaysOfMLCode, a challenge that encourages beginners to code and study machine learning for at least an hour, every day for 100 days.
GitHub, K-nearest neighbors, Machine Learning, Python, SVM
- Introduction to k-Nearest Neighbors - Mar 22, 2018.
What k-Nearest Neighbors (kNN) is, some of its useful applications, and how it works.
K-nearest neighbors, Machine Learning
- Introduction to Optimization with Genetic Algorithm - Mar 14, 2018.
This article gives a brief introduction to evolutionary algorithms (EAs) and describes the genetic algorithm (GA), one of the simplest random-based EAs.
Genetic Algorithm, K-nearest neighbors, Optimization
- Top 10 Machine Learning with R Videos - Oct 24, 2017.
A complete video guide to Machine Learning in R! This great compilation of tutorials and lectures is an amazing recipe to start developing your own Machine Learning projects.
Algorithms, Clustering, K-nearest neighbors, Machine Learning, PCA, R, Text Mining, Top 10, Youtube
- Top 10 Machine Learning Algorithms for Beginners - Oct 20, 2017.
A beginner's introduction to the Top 10 Machine Learning (ML) algorithms, complete with figures and examples for easy understanding.
Adaboost, Algorithms, Apriori, Bagging, Beginners, Boosting, Decision Trees, Ensemble Methods, Explained, K-means, K-nearest neighbors, Linear Regression, Logistic Regression, Machine Learning, Naive Bayes, PCA, Top 10
- K-Nearest Neighbors – the Laziest Machine Learning Technique - Sep 12, 2017.
K-Nearest Neighbors (K-NN) is one of the simplest machine learning algorithms. When a new situation occurs, it scans through all past experiences and looks up the k closest experiences. Those experiences (or: data points) are what we call the k nearest neighbors.
Algorithms, K-nearest neighbors, Machine Learning, RapidMiner
- Neighbors Know Best: (Re) Classifying an Underappreciated Beer - Nov 24, 2016.
A look at beer features to determine whether a specific brew might be better served (pun intended) by being classified under a different style. The kNN analysis is supported with in-post plots and a linked iPython notebook.
Beer, Classification, Data Visualization, K-nearest neighbors, Python
- The Great Algorithm Tutorial Roundup - Sep 20, 2016.
This is a collection of tutorials relating to the results of the recent KDnuggets algorithms poll. If you are interested in learning or brushing up on the algorithms our readers use most, look here for suggestions on doing so!
Algorithms, Clustering, Decision Trees, K-nearest neighbors, Machine Learning, PCA, Poll, random forests algorithm, Regression, Statistics, Text Mining, Time Series, Visualization
- Implementing Your Own k-Nearest Neighbor Algorithm Using Python - Jan 27, 2016.
A detailed explanation of one of the most used machine learning algorithms, k-Nearest Neighbors, and its implementation from scratch in Python. Enhance your algorithmic understanding with this hands-on coding exercise.
K-nearest neighbors, Python, Python Tutorial
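As a hedged taste of the from-scratch approach (a compact sketch, not the tutorial's own code), the core of kNN fits in a few lines: compute distances to all training points, take the k closest, and majority-vote their labels.

```python
# Compact from-scratch sketch of the core kNN idea (not the article's code).
from collections import Counter
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    # Euclidean distance from the query point to every training point
    distances = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbours
    nearest = np.argsort(distances)[:k]
    # Majority vote among their labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Tiny illustrative example
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [7.5, 8.2]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([7.8, 7.9]), k=3))  # -> 1
```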
- Top 10 Data Mining Algorithms, Explained - May 21, 2015.
The top 10 data mining algorithms, selected by top researchers, are explained here, including what they do, the intuition behind each algorithm, available implementations, why you would use them, and interesting applications.
Algorithms, Apriori, Bayesian, Boosting, C4.5, CART, Data Mining, Explained, K-means, K-nearest neighbors, Naive Bayes, Page Rank, Support Vector Machines, Top 10
- Do We Need More Training Data or More Complex Models? - Mar 23, 2015.
Do we need more training data? Which models will suffer from performance saturation as data grows large? Do we need larger models or more complicated models, and what is the difference?
Big Data, convnet, Generalized Linear Models, K-nearest neighbors, Training Data, Zachary Lipton