Evaluating Deep Learning Models: The Confusion Matrix, Accuracy, Precision, and Recall - Feb 19, 2021.
This tutorial discusses the confusion matrix: how precision, recall, and accuracy are calculated from it, and how these metrics relate to evaluating deep learning models.
Accuracy, Confusion Matrix, Deep Learning, Metrics, Precision, Recall
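As a quick reference for how these three metrics fall out of the four confusion matrix counts, here is a minimal sketch (the variable names tp, fp, tn, fn and the counts themselves are illustrative, not taken from the tutorial):

```python
# Counts from a hypothetical binary confusion matrix
tp, fp = 45, 5   # predicted positive: correct / incorrect
tn, fn = 40, 10  # predicted negative: correct / incorrect

accuracy = (tp + tn) / (tp + tn + fp + fn)   # all correct / all predictions
precision = tp / (tp + fp)                   # correct positives / predicted positives
recall = tp / (tp + fn)                      # correct positives / actual positives

print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
```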
Popular Machine Learning Interview Questions - Jan 20, 2021.
Get ready for your next job interview requiring domain knowledge in machine learning with answers to these eleven common questions.
Bias, Confusion Matrix, Interview Questions, Machine Learning, Overfitting, Variance
How to Evaluate the Performance of Your Machine Learning Model - Sep 3, 2020.
You can train your supervised machine learning models all day long, but unless you evaluate their performance, you can never know whether a model is useful. This detailed discussion reviews the various performance metrics you must consider, and offers intuitive explanations of what they mean and how they work.
Accuracy, Confusion Matrix, Machine Learning, Precision, Predictive Modeling, Recall, ROC-AUC
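One metric from this list, ROC-AUC, is computed from predicted probabilities rather than hard labels. As a rough illustration, this sketch uses scikit-learn with made-up labels and scores:

```python
from sklearn.metrics import roc_auc_score

# Hypothetical ground-truth labels and model scores (probability of the positive class)
y_true = [0, 0, 1, 1, 0, 1, 1, 0]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.65, 0.3]

# ROC-AUC summarizes ranking quality across all classification thresholds
print(roc_auc_score(y_true, y_score))
```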
Model Evaluation Metrics in Machine Learning - May 28, 2020.
A detailed explanation of the evaluation metrics used to assess a classification machine learning model.
Classification, Confusion Matrix, Machine Learning, Metrics, Python, Regression
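In Python, which the article covers, most of these classification metrics are one call away in scikit-learn. A minimal sketch with invented labels and predictions:

```python
from sklearn.metrics import confusion_matrix, classification_report

# Invented true labels and predictions for a binary classifier
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

print(confusion_matrix(y_true, y_pred))        # rows: true class, columns: predicted class
print(classification_report(y_true, y_pred))   # precision, recall, F1 per class
```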
More Performance Evaluation Metrics for Classification Problems You Should Know - Apr 3, 2020.
When building and optimizing your classification model, measuring how accurately it predicts your expected outcome is crucial. However, this metric alone is never the entire story, as it can still offer misleading results. That's where these additional performance evaluations come into play to help tease out more meaning from your model.
Classification, Confusion Matrix, Machine Learning, Metrics, Precision, Recall, ROC-AUC
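To see how accuracy alone can mislead, consider a heavily imbalanced dataset. This sketch (with fabricated labels) shows a majority-class predictor scoring high accuracy but zero F1 on the minority class:

```python
from sklearn.metrics import accuracy_score, f1_score

# 95 negatives, 5 positives; the "model" always predicts the majority class
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100

print(accuracy_score(y_true, y_pred))                   # 0.95, which looks impressive
print(f1_score(y_true, y_pred, zero_division=0))        # 0.0, exposing that no positives were found
```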
Idiot’s Guide to Precision, Recall, and Confusion Matrix - Jan 13, 2020.
Building Machine Learning models is fun, but making sure we build the best ones is what makes a difference. Follow this quick guide to learn how to effectively evaluate a classification model, especially for projects where accuracy alone is not enough.
Accuracy, Beginners, Classification, Confusion Matrix, Precision, Predictive Modeling, Recall
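One way to internalize the precision/recall trade-off such a guide covers is to watch both metrics move as the decision threshold changes. A small sketch with made-up labels and scores:

```python
from sklearn.metrics import precision_score, recall_score

y_true = [0, 0, 1, 1, 0, 1, 1, 0]
y_score = [0.2, 0.6, 0.55, 0.9, 0.1, 0.75, 0.4, 0.3]

# Raising the threshold typically trades recall for precision
for threshold in (0.3, 0.5, 0.7):
    y_pred = [int(s >= threshold) for s in y_score]
    p = precision_score(y_true, y_pred, zero_division=0)
    r = recall_score(y_true, y_pred, zero_division=0)
    print(f"threshold={threshold}: precision={p:.2f} recall={r:.2f}")
```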
Using Confusion Matrices to Quantify the Cost of Being Wrong - Oct 11, 2018.
The terms ‘true condition’ (the actual outcome) and ‘predicted condition’ (the model’s output) are used when discussing confusion matrices, and each can take a positive or negative value. This means that you need to understand the differences between, and ultimately the costs associated with, Type I and Type II errors.
Confusion Matrix, Data Science, Machine Learning, Metrics, Predictive Modeling
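To make the cost framing concrete, one simple approach (a sketch under assumed costs, not necessarily the article's own method) is to multiply the confusion matrix elementwise by a cost matrix and sum:

```python
import numpy as np

# Hypothetical confusion matrix: rows = true class, columns = predicted class
#                 pred neg  pred pos
cm = np.array([[900,  50],    # true negatives, false positives (Type I errors)
               [ 30,  20]])   # false negatives (Type II errors), true positives

# Assumed business costs: a missed positive (Type II) hurts 10x more than a false alarm
cost = np.array([[0,  1],
                 [10, 0]])

total_cost = (cm * cost).sum()  # 50*1 + 30*10 = 350
print(total_cost)
```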