Evaluating Deep Learning Models: The Confusion Matrix, Accuracy, Precision, and Recall - Feb 19, 2021.
This tutorial discusses the confusion matrix and how precision, recall, and accuracy are calculated from it, and how these metrics relate to evaluating deep learning models (a minimal calculation is sketched after this entry).
Accuracy, Confusion Matrix, Deep Learning, Metrics, Precision, Recall
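The following is a minimal sketch of how accuracy, precision, and recall fall out of the four cells of a binary confusion matrix; the counts are hypothetical and not taken from the tutorial.

```python
# Hypothetical confusion-matrix counts for a binary classifier.
tp, fp, fn, tn = 40, 10, 5, 45

accuracy  = (tp + tn) / (tp + tn + fp + fn)  # share of all predictions that are correct
precision = tp / (tp + fp)                   # of predicted positives, how many are truly positive
recall    = tp / (tp + fn)                   # of actual positives, how many the model found

print(f"accuracy={accuracy:.2f} precision={precision:.2f} recall={recall:.2f}")
```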
Popular Machine Learning Interview Questions - Jan 20, 2021.
Get ready for your next machine learning job interview with answers to these eleven common questions.
Bias, Confusion Matrix, Interview Questions, Machine Learning, Overfitting, Variance
How to Evaluate the Performance of Your Machine Learning Model - Sep 3, 2020.
You can train your supervised machine learning models all day long, but unless you evaluate their performance, you can never know whether a model is actually useful. This detailed discussion reviews the performance metrics you must consider, and offers intuitive explanations of what they mean and how they work (a short scikit-learn sketch follows this entry).
Accuracy, Confusion Matrix, Machine Learning, Precision, Predictive Modeling, Recall, ROC-AUC
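As a companion to the metrics discussed there, here is a short sketch assuming scikit-learn is available; the labels and scores below are hypothetical, not data from the article.

```python
from sklearn.metrics import roc_auc_score, precision_score, recall_score

y_true  = [0, 0, 1, 1, 1, 0, 1, 0]                    # ground-truth labels
y_score = [0.1, 0.4, 0.35, 0.8, 0.7, 0.2, 0.9, 0.6]   # predicted probabilities

auc = roc_auc_score(y_true, y_score)                  # threshold-independent ranking quality
y_pred = [1 if s >= 0.5 else 0 for s in y_score]      # hard labels at a 0.5 threshold
print(f"ROC-AUC={auc:.2f} "
      f"precision={precision_score(y_true, y_pred):.2f} "
      f"recall={recall_score(y_true, y_pred):.2f}")
```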
Model Evaluation Metrics in Machine Learning - May 28, 2020.
A detailed explanation of the evaluation metrics used to assess a classification machine learning model (a scikit-learn example follows this entry).
Classification, Confusion Matrix, Machine Learning, Metrics, Python, Regression
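Since the tags above mention Python, here is a minimal sketch, assuming scikit-learn, of producing a confusion matrix and the standard classification metrics; the label vectors are hypothetical.

```python
from sklearn.metrics import confusion_matrix, classification_report

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]   # hypothetical ground truth
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]   # hypothetical predictions

print(confusion_matrix(y_true, y_pred))       # rows = true class, columns = predicted class
print(classification_report(y_true, y_pred))  # per-class precision, recall, F1, support
```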
Using Confusion Matrices to Quantify the Cost of Being Wrong - Oct 11, 2018.
A confusion matrix pairs the ‘true condition’ with the ‘predicted condition’, each of which can be a positive or negative outcome. This means you need to understand the differences between Type I and Type II errors, and eventually the costs associated with each (a cost-weighting sketch follows this entry).
Confusion Matrix, Data Science, Machine Learning, Metrics, Predictive Modeling
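To make the cost idea concrete, here is a minimal sketch that attaches a per-error cost to Type I and Type II errors; both the counts and the costs are hypothetical assumptions, not figures from the article.

```python
# Hypothetical error counts and per-error business costs.
fp, fn = 10, 5                  # Type I (false positive) and Type II (false negative) counts
cost_fp, cost_fn = 5.0, 50.0    # assumed cost of each error type

total_cost = fp * cost_fp + fn * cost_fn
print(f"Type I cost: {fp * cost_fp:.0f}, Type II cost: {fn * cost_fn:.0f}, total: {total_cost:.0f}")
```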