- 5 Concepts You Should Know About Gradient Descent and Cost Function - May 7, 2020.
Why is Gradient Descent so important in Machine Learning? Learn more about this iterative optimization algorithm and how it is used to minimize a loss function.
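To make the idea behind that article concrete, here is a minimal gradient-descent sketch; the quadratic loss, learning rate, and starting point are illustrative choices, not taken from the article:

```python
# Minimal gradient descent on a 1-D quadratic loss f(w) = (w - 3)^2.
# The loss, learning rate, and starting point are illustrative assumptions.
def gradient_descent(grad, w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # step in the direction opposite the gradient
    return w

# d/dw (w - 3)^2 = 2 * (w - 3), so the minimum is at w = 3
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))
```

Each iteration moves the parameter a small step downhill, so `w_min` converges toward 3.0, the minimizer of the loss.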
- Choosing an Error Function - Jun 10, 2019.
The error function expresses how much we care about a deviation of a certain size. The choice of error function depends entirely on how our model will be used.
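A tiny sketch of that point: squared error and absolute error express different degrees of concern about large deviations (the sample deviations below are made up for illustration):

```python
# How much a deviation "costs" depends on the error function chosen.
# The deviation values here are illustrative.
errors = [0.5, 1.0, 2.0, 4.0]

squared = [e ** 2 for e in errors]    # penalizes large deviations disproportionately
absolute = [abs(e) for e in errors]   # penalizes deviations in proportion to their size

print(squared)   # doubling the deviation quadruples the penalty
print(absolute)  # doubling the deviation doubles the penalty
```

Under squared error a deviation of 4 costs 64 times as much as a deviation of 0.5; under absolute error it costs only 8 times as much, which is why the right choice depends on how the model will be used.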
- Understanding Objective Functions in Neural Networks - Nov 23, 2017.
This blog post is targeted towards people who have experience with machine learning, and want to get a better intuition on the different objective functions used to train neural networks.
- Machine Learning Crash Course: Part 1 - May 24, 2017.
This post, the first in a series of ML tutorials, aims to make machine learning accessible to anyone willing to learn. We’ve designed it to give you a solid understanding of how ML algorithms work, as well as to provide you with the knowledge to harness them in your projects.
- Regularization in Logistic Regression: Better Fit and Better Generalization? - Jun 24, 2016.
A discussion of regularization in logistic regression, and how applying it can lead to better model fit and better generalization.
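As a sketch of the technique that article discusses, here is L2-regularized logistic regression trained by gradient descent; the synthetic data, learning rate, and regularization strength `lam` are all illustrative assumptions:

```python
import numpy as np

# L2-regularized logistic regression via gradient descent.
# The synthetic data and hyperparameters below are illustrative.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = (X[:, 0] - X[:, 1] > 0).astype(float)  # linearly separable labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lam = 0.1  # regularization strength; larger values shrink the weights more
w = np.zeros(2)
for _ in range(300):
    p = sigmoid(X @ w)
    # gradient of the mean log-loss plus the L2 penalty term lam * w
    g = X.T @ (p - y) / len(y) + lam * w
    w -= 0.5 * g

accuracy = np.mean((sigmoid(X @ w) > 0.5) == y)
print(np.round(w, 3), accuracy)
```

The penalty term `lam * w` keeps the weights from growing without bound on separable data, trading a slightly worse fit on the training set for better-behaved, more generalizable coefficients.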
- A Concise Overview of Standard Model-fitting Methods - May 27, 2016.
A very concise overview of 4 standard model-fitting methods, focusing on their differences: closed-form equations, gradient descent, stochastic gradient descent, and mini-batch learning.
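The contrast among those methods can be sketched on a small least-squares problem; the synthetic data, step sizes, and batch size below are illustrative assumptions, not from the article:

```python
import numpy as np

# Three of the model-fitting methods on least-squares linear regression:
# closed-form normal equations, full-batch gradient descent, and
# mini-batch stochastic gradient descent. Data and hyperparameters
# are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

def grad(w, Xb, yb):
    # gradient of the mean squared error on a batch
    return 2.0 / len(Xb) * Xb.T @ (Xb @ w - yb)

# 1. Closed form: solve the normal equations (X^T X) w = X^T y
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# 2. Full-batch gradient descent: one update per pass over all data
w_gd = np.zeros(3)
for _ in range(500):
    w_gd -= 0.1 * grad(w_gd, X, y)

# 3. Mini-batch SGD: updates on shuffled batches of 20
#    (a batch size of 1 would be plain stochastic gradient descent)
w_sgd = np.zeros(3)
for epoch in range(50):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), 20):
        b = idx[start:start + 20]
        w_sgd -= 0.05 * grad(w_sgd, X[b], y[b])

print(np.round(w_closed, 2), np.round(w_gd, 2), np.round(w_sgd, 2))
```

All three recover essentially the same weights here; the differences the article focuses on are cost and scalability: the closed form requires solving a linear system over the full dataset, full-batch descent needs every example per update, and mini-batch learning trades noisier steps for much cheaper updates.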
- Interpreting Model Performance with Cost Functions - Jan 13, 2014.
Cost functions are critical for correctly assessing the performance of data mining and predictive models. This series goes deep into the statistical properties and mathematical underpinnings of each cost function, and explores their similarities and differences.