# Tag: Gradient Descent (18)

**Deep Learning in H2O using R** - Jan 22, 2018.

This article is about implementing Deep Learning (DL) using the H2O package in R. We start with a background on DL, cover some features of H2O's DL framework, and finish with an implementation in R.

**The 10 Deep Learning Methods AI Practitioners Need to Apply** - Dec 13, 2017.

Deep learning emerged from that decade's explosive computational growth as a serious contender in the field, winning many important machine learning competitions. The interest has not cooled as of 2017; today, we see deep learning mentioned in every corner of machine learning.

**Understanding Objective Functions in Neural Networks** - Nov 23, 2017.

This blog post is targeted at people who have experience with machine learning and want to build better intuition for the different objective functions used to train neural networks.

**Neural Network Foundations, Explained: Updating Weights with Gradient Descent & Backpropagation** - Oct 25, 2017.

In neural networks, connection weights are adjusted in order to help reconcile the differences between the actual and predicted outcomes for subsequent forward passes. But how, exactly, do these weights get adjusted?

**How I started with learning AI in the last 2 months** - Oct 9, 2017.

Being a full stack developer alone will not be enough in the changing landscape. In the next two years, full stack will not be full stack without AI skills.

**37 Reasons why your Neural Network is not working** - Aug 22, 2017.

Over the course of many debugging sessions, I've compiled my experience along with the best ideas around into this handy list. I hope you find it useful.

**Train your Deep Learning model faster and sharper: Snapshot Ensembling - M models for the cost of 1** - Aug 2, 2017.

We explain a novel Snapshot Ensembling method for increasing the accuracy of Deep Learning models while also reducing training time.

**Optimization in Machine Learning: Robust or global minimum?** - Jun 30, 2017.

Here we discuss how convex problems are solved and optimised in machine learning/deep learning.

**Machine Learning Crash Course: Part 1** - May 24, 2017.

This post, the first in a series of ML tutorials, aims to make machine learning accessible to anyone willing to learn. We've designed it to give you a solid understanding of how ML algorithms work, as well as the knowledge to harness them in your own projects.

**KDnuggets™ News 17:n19, May 17: Guerrilla Guide to Machine Learning with R; 5 Machine Learning Projects You Can’t Overlook** - May 17, 2017.

The Guerrilla Guide to Machine Learning with R; 5 Machine Learning Projects You Can No Longer Overlook; The Two Phases of Gradient Descent in Deep Learning; HDFS vs. HBase: All you need to know; Must-Know: What are common data quality issues for Big Data and how to handle them?

**Top /r/MachineLearning Posts, April: Why Momentum Really Works; Machine Learning with Scikit-Learn & TensorFlow** - May 5, 2017.

Why Momentum Really Works; O'Reilly's Hands-On Machine Learning with Scikit-Learn and TensorFlow; Implemented BEGAN and saw a cute face at iteration 168k; Self-driving car course; Exploring the mysteries of Go; DeepMind Solves AGI

**KDnuggets™ News 17:n17, May 3: Learn Machine Learning… in 10 Days?!? Gradient Descent, Simplified** - May 3, 2017.

How to Learn Machine Learning in 10 Days; Keep it simple! How to understand Gradient Descent algorithm; The Guerrilla Guide to Machine Learning with Python; What Data You Analyzed - KDnuggets Poll Results and Trends; Cartoon: Machine Learning - What They Think I Do

**Keep it simple! How to understand Gradient Descent algorithm** - Apr 28, 2017.

In data science, gradient descent is one of the most important, and most difficult, concepts. Here we explain it with an example, in a very simple way. Check this out.

**Learning to Learn by Gradient Descent by Gradient Descent** - Feb 2, 2017.

What if instead of hand-designing an optimising algorithm (function) we *learn* it instead? That way, by training on the class of problems we're interested in solving, we can learn an optimum optimiser for the class!

**The Gentlest Introduction to Tensorflow – Part 2** - Aug 19, 2016.

Check out the second and final part of this introductory tutorial to TensorFlow.

**The Gentlest Introduction to Tensorflow – Part 1** - Aug 17, 2016.

In this series of articles, we present the gentlest introduction to TensorFlow, starting by showing how to do linear regression for a single-feature problem and expanding from there.

**A Concise Overview of Standard Model-fitting Methods** - May 27, 2016.

A very concise overview of 4 standard model-fitting methods, focusing on their differences: closed-form equations, gradient descent, stochastic gradient descent, and mini-batch learning.

**All Machine Learning Models Have Flaws** - Mar 3, 2015.

This classic post examines what is right and wrong with different models of machine learning, including Bayesian learning, Graphical Models, Convex Loss Optimization, Statistical Learning, and more.
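Since most of the posts tagged here revolve around gradient descent, a minimal self-contained sketch of the algorithm may serve as a quick reference. This toy example, which is not taken from any of the articles above (the data, learning rate, and step count are illustrative choices), fits a line y = w·x + b to a handful of points by plain batch gradient descent on mean squared error:

```python
# Minimal batch gradient descent for a 1-D linear model y = w*x + b.
# Data and hyperparameters are illustrative, not from any linked article.

def gradient_descent(xs, ys, lr=0.05, steps=500):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the mean squared error (1/n) * sum((w*x + b - y)^2)
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # Step in the direction opposite the gradient
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data generated from y = 3x + 1, so the fit should approach w ~ 3, b ~ 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 4.0, 7.0, 10.0, 13.0]
w, b = gradient_descent(xs, ys)
print(w, b)
```

The learning rate and number of steps are exactly the knobs that several of the posts above (momentum, the two phases of gradient descent, snapshot ensembling) discuss tuning; stochastic and mini-batch variants replace the full-data sums with sums over sampled subsets.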