# Tag: Neural Networks

**The Two Phases of Gradient Descent in Deep Learning**- May 12, 2017.

In short, you reach different resting places with different SGD algorithms. That is, different SGD variants give you differing convergence rates due to their different strategies, but we do expect them all to end up at the same result!

**Top 10 Recent AI videos on YouTube**- May 10, 2017.

The top-viewed videos on artificial intelligence since 2016 include great talks and lecture series from MIT and Caltech, and Google Tech Talks on AI.

**Using Deep Learning To Extract Knowledge From Job Descriptions**- May 9, 2017.

We present a deep learning approach to extract knowledge from a large amount of data from the recruitment space. A learning to rank approach is followed to train a convolutional neural network to generate job title and job description embeddings.

**Building, Training, and Improving on Existing Recurrent Neural Networks**- May 8, 2017.

In this post, we’ll provide a short tutorial for training an RNN for speech recognition, including code snippets throughout.

**Top 10 Machine Learning Videos on YouTube, updated**- May 3, 2017.

The top machine learning videos on YouTube include lecture series from Stanford and Caltech, Google Tech Talks on deep learning, using machine learning to play Mario and Hearthstone, and detecting NHL goals from live streams.

**KDnuggets™ News 17:n17, May 3: Learn Machine Learning… in 10 Days?!? Gradient Descent, Simplified**- May 3, 2017.

How to Learn Machine Learning in 10 Days; Keep it simple! How to understand Gradient Descent algorithm; The Guerrilla Guide to Machine Learning with Python; What Data You Analyzed - KDnuggets Poll Results and Trends; Cartoon: Machine Learning - What They Think I Do

**Deep Learning – Past, Present, and Future**- May 2, 2017.

There is a lot of buzz around deep learning technology. First developed in the 1940s, deep learning was meant to simulate neural networks found in brains, but in the last decade 3 key developments have unleashed its potential.

**One Deep Learning Virtual Machine to Rule Them All**- Apr 28, 2017.

The frontend code of programming languages only needs to parse and translate source code to an intermediate representation (IR). Deep Learning frameworks will eventually need their own “IR.”

**How to Build a Recurrent Neural Network in TensorFlow**- Apr 26, 2017.

This is a no-nonsense overview of implementing a recurrent neural network (RNN) in TensorFlow. Both theory and practice are covered concisely, and the end result is running TensorFlow RNN code.

**Awesome Deep Learning: Most Cited Deep Learning Papers**- Apr 21, 2017.

This post introduces a curated list of the most cited deep learning papers (since 2012), provides the inclusion criteria, shares a few entry examples, and points to the full listing for those interested in investigating further.

**Negative Results on Negative Images: Major Flaw in Deep Learning?**- Apr 20, 2017.

This is an overview of recent research outlining the limitations of the capabilities of image recognition using deep neural networks. But should this really be considered a "limitation?"

**Medical Image Analysis with Deep Learning, Part 2**- Apr 13, 2017.

In this article we talk about the basics of deep learning through the lens of convolutional neural nets. We plan to use this knowledge to build CNNs in the next post and use Keras to develop a model to predict lung cancer.

**5 Machine Learning Projects You Can No Longer Overlook, April**- Apr 13, 2017.

It's about that time again... 5 more machine learning or machine learning-related projects you may not yet have heard of, but may want to consider checking out. Find tools for data exploration, topic modeling, high-level APIs, and feature selection herein.

**Medical Image Analysis with Deep Learning**- Apr 6, 2017.

In this article, I start with the basics of image processing and medical image format data, and visualize some medical data.

**Deep Learning, Generative Adversarial Networks & Boxing – Toward a Fundamental Understanding**- Mar 28, 2017.

In this post we will see why GANs have so much potential, and frame GANs as a boxing match between two opponents.

**Cooperative Trust Among Neural Networks Drives Deeper Learning**- Feb 28, 2017.

Machine learning developers need to model a growing range of multi-partner scenarios where many learning agents and data sources interact under varying degrees of trustworthiness. This IBM site helps you take the next step towards continuous intelligence.

**An Overview of Python Deep Learning Frameworks**- Feb 27, 2017.

Read this concise overview of leading Python deep learning frameworks, including Theano, Lasagne, Blocks, TensorFlow, Keras, MXNet, and PyTorch.

**The Anatomy of Deep Learning Frameworks**- Feb 24, 2017.

This post sketches out some common principles which would help you better understand deep learning frameworks, and provides a guide on how to implement your own deep learning framework as well.
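As a small taste of what "implementing your own deep learning framework" involves, here is a minimal sketch of the core abstraction most frameworks build on: a value that records how it was computed, so gradients can flow backwards through the graph via the chain rule. All names here (`Value`, `backward`) are illustrative, not taken from any real framework.

```python
# A minimal scalar "autograd" value: the heart of a deep learning
# framework, stripped to addition and multiplication.

class Value:
    def __init__(self, data, parents=(), backprop=lambda: None):
        self.data = data          # forward result
        self.grad = 0.0           # d(output)/d(this value)
        self._parents = parents   # nodes this value was computed from
        self._backprop = backprop # pushes self.grad onto the parents

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backprop():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backprop = backprop
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backprop():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backprop = backprop
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule
        # from the output back to the leaves.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backprop()

x = Value(3.0)
y = Value(4.0)
z = x * y + x   # z = xy + x, so dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
```

Real frameworks add tensors, many more operations, and optimized kernels, but the record-the-graph-then-walk-it-backwards structure is the same.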

**The Gentlest Introduction to Tensorflow – Part 4**- Feb 22, 2017.

This post is the fourth entry in a series dedicated to introducing newcomers to TensorFlow in the gentlest possible manner, and focuses on logistic regression for classifying the digits 0-9.

**The Gentlest Introduction to Tensorflow – Part 3**- Feb 21, 2017.

This post is the third entry in a series dedicated to introducing newcomers to TensorFlow in the gentlest possible manner. This entry progresses to multi-feature linear regression.

**Turbo Charge Agile Processes with Deep Learning**- Feb 7, 2017.

The key to leveraging Deep Learning, or more broadly AI, in the workplace is to understand where it fits within an agile development environment.

**Top arXiv Papers, January: ConvNets Advances, Wide Instead of Deep, Adversarial Networks Win, Learning to Reinforcement Learn**- Feb 3, 2017.

Check out the top arXiv Papers from January, covering convolutional neural network advances, why wide may trump deep, generative adversarial networks, learning to reinforcement learn, and more.

**Deep Learning Research Review: Natural Language Processing**- Jan 31, 2017.

This edition of Deep Learning Research Review explains recent research papers in Natural Language Processing (NLP). If you don't have the time to read the top papers yourself, or need an overview of NLP with Deep Learning, this post is for you.

**Creating Curious Machines: Building Information-seeking Agents**- Jan 24, 2017.

Researchers at Maluuba are developing ways to teach artificial agents how to seek information actively, by asking questions. This includes a deep neural agent that learns to accomplish these tasks through efficient information-seeking behaviour, a vital step towards Artificial General Intelligence.

**Deep Learning Can be Applied to Natural Language Processing**- Jan 16, 2017.

This post is a rebuttal to a recent article suggesting that neural networks cannot be applied to natural language given that language is not produced as a result of a continuous function. The post delves into some additional points on deep learning as well.

**The Major Advancements in Deep Learning in 2016**- Jan 5, 2017.

Get a concise overview of the major advancements observed in deep learning over the past year.

**Generative Adversarial Networks – Hot Topic in Machine Learning**- Jan 3, 2017.

What are Generative Adversarial Networks (GANs)? A very illustrative explanation of GANs is presented here with simple examples, like predicting the next frame in a video sequence or predicting the next word while typing in a Google search.

**5 Machine Learning Projects You Can No Longer Overlook, January**- Jan 2, 2017.

There are a lot of popular machine learning projects out there, but many more that are not. Which of these are actively developed and worth checking out? Here is an offering of 5 such projects, the most recent in an ongoing series.

**ResNets, HighwayNets, and DenseNets, Oh My!**- Dec 19, 2016.

This post walks through the logic behind three recent deep learning architectures: ResNet, HighwayNet, and DenseNet. Each makes it possible to successfully train deeper networks by overcoming the limitations of traditional network design.

**Deep Learning Works Great Because the Universe, Physics and the Game of Go are Vastly Simpler than Prior Models and Have Exploitable Patterns**- Dec 16, 2016.

Why is Deep Learning experiencing such success in solving complex problems? Deep Learning is useful and powerful, but it also helps that the problems were not as big or as hard as researchers feared while they remained unsolved.

**KDnuggets™ News 16:n44, Dec 14: Key Data Science 2016 Events, 2017 Trends; Where Data Science was applied; Bayesian Basics**- Dec 14, 2016.

Data Science, Predictive Analytics Main Developments in 2016, Key Trends in 2017; Where Analytics, Data Mining, Data Science were applied in 2016; Bayesian Basics, Explained; Data Science Trends To Look Out For In 2017; Artificial Neural Networks (ANN) Introduction

**Artificial Neural Networks (ANN) Introduction, Part 2**- Dec 9, 2016.

Matching the performance of a human brain is a difficult feat, but techniques have been developed to improve the performance of neural network algorithms, 3 of which are discussed in this post: Distortion, mini-batch gradient descent, and dropout.

**Artificial Neural Networks (ANN) Introduction, Part 1**- Dec 8, 2016.

This intro to ANNs will look at how we can train an algorithm to recognize images of handwritten digits. We will be using the images from the famous MNIST (Mixed National Institute of Standards and Technology) database.

**Top KDnuggets tweets, Nov 30 – Dec 06: A great and useful collection of minimal and clean implementations of #MachineLearning algorithms**- Dec 7, 2016.

Also: #MachineLearning Yearning book draft, Free Download, by Andrew Ng; A short guide to learn #NeuralNets, and maybe get famous and rich with #DeepLearning; Free Book: Foundations of Computer Science, Aho & Ullman.

**The hard thing about deep learning**- Dec 1, 2016.

It’s easy to optimize simple neural networks, say a single-layer perceptron. But as networks become deeper, the optimization problem becomes harder. This article discusses such optimization problems in deep neural networks.

**Deep Learning Reading Group: Skip-Thought Vectors**- Nov 17, 2016.

Skip-thought vectors take inspiration from Word2Vec skip-gram and attempt to extend it to sentences, and are created using an encoder-decoder model. Read on for an overview of the paper.

**An Intuitive Explanation of Convolutional Neural Networks**- Nov 11, 2016.

This article provides an easy-to-understand introduction to what convolutional neural networks are and how they work.

**A Quick Introduction to Neural Networks**- Nov 9, 2016.

This article provides a beginner-level introduction to multilayer perceptrons and backpropagation.

**Deep Learning cleans podcast episodes from ‘ahem’ sounds**- Nov 8, 2016.

“3.5 mm audio jack… Ahem!!” Where did you hear that? ;) Well, this post is not about Google Pixel vs iPhone 7, but about how to remove the ugly “ahem” sounds from speech using a deep convolutional neural network. I must say, a very interesting read.

**Top /r/MachineLearning Posts, October: NSFW Image Recognition, Differentiable Neural Computers, Hinton on Coursera**- Nov 4, 2016.

NSFW Image Recognition, Differentiable Neural Computers, Hinton's Neural Networks for Machine Learning Coursera course; Introducing the AI Open Network; Making a Self-driving RC Car

**KDnuggets™ News 16:n38, Oct 26: Free Machine Learning EBooks; Neural Networks in Python with Scikit-learn**- Oct 26, 2016.

5 EBooks to Read Before Getting into A Machine Learning Career; A Beginner's Guide to Neural Networks with Python and Scikit-learn 0.18!; New Poll: What was the largest dataset you analyzed / data mined?; Jupyter Notebook Best Practices for Data Science

**A Beginner’s Guide to Neural Networks with Python and SciKit Learn 0.18!**- Oct 20, 2016.

This post outlines setting up a neural network in Python using Scikit-learn, the latest version of which now has built-in support for neural network models.

**KDnuggets™ News 16:n37, Oct 19: Top Data Science Videos; 12 Interesting Big Data Careers; Deep Learning Key Terms**- Oct 19, 2016.

Top 10 Data Science Videos on YouTube; Top 12 Interesting Careers to Explore in Big Data; Deep Learning Key Terms, Explained; Artificial Intelligence, Deep Learning, and Neural Networks, Explained; MLDB: The Machine Learning Database

**Artificial Intelligence, Deep Learning, and Neural Networks, Explained**- Oct 14, 2016.

This article is meant to explain the concepts of AI, deep learning, and neural networks at a level that can be understood by most non-practitioners, and can also serve as a reference or review for technical folks.

**Deep Learning Key Terms, Explained**- Oct 12, 2016.

Gain a beginner's perspective on artificial neural networks and deep learning with this set of 14 straight-to-the-point related key concept definitions, including Biological Neuron, Multilayer Perceptron (MLP), Feedforward Neural Network, and Recurrent Neural Network.

**Deep Learning Reading Group: SqueezeNet**- Sep 29, 2016.

This paper introduces a small CNN architecture called “SqueezeNet” that achieves AlexNet-level accuracy on ImageNet with 50x fewer parameters.

**Neural Designer: Predictive Analytics Software**- Sep 26, 2016.

Neural Designer's advanced neural network algorithms, combined with a simple user interface and fast performance, make it a great tool for data scientists. Download the free 15-day trial version.

**Up to Speed on Deep Learning: August Update, Part 2**- Sep 23, 2016.

This is the second part of an overview of deep learning stories that made news in August. Look to see if you have missed anything.

**Deep Learning Reading Group: Deep Residual Learning for Image Recognition**- Sep 22, 2016.

Published in 2015, today's paper offers a new architecture for Convolution Networks, one which has since become a staple in neural network implementation. Read all about it here.

**Top KDnuggets tweets, Sep 14-20: Why we need #DataScience: brain wont let us see 12 black dots at intersection**- Sep 21, 2016.

Why we need #DataScience: brain won't let us see 12 black dots at intersections; #Blurring sensitive info no longer safe! #MachineLearning can recover originals; Pokemon Go Data; The #NeuralNetwork Zoo - Great chart of different configurations.

**Up to Speed on Deep Learning: August Update**- Sep 21, 2016.

Check out this thorough roundup of deep learning stories that made news in August, and see if there are any items of note that you missed.

**9 Key Deep Learning Papers, Explained**- Sep 20, 2016.

If you are interested in understanding the current state of deep learning, this post outlines and thoroughly summarizes 9 of the most influential contemporary papers in the field.

**Deep Learning Reading Group: Deep Compression**- Sep 15, 2016.

A concise overview of a paper covering three methods of compressing a neural network in order to reduce the size of the network on disk, improve performance, and decrease run time.

**Urban Sound Classification with Neural Networks in Tensorflow**- Sep 12, 2016.

This post discusses techniques for feature extraction from sound in Python using the open source library Librosa, and implements a neural network in Tensorflow to categorize urban sounds, including car horns, children playing, barking dogs, and more.

**Deep Learning Reading Group: Deep Networks with Stochastic Depth**- Sep 8, 2016.

A concise overview of a recent paper introducing stochastic depth networks, a new way to perturb networks during training in order to improve their performance.

**A Beginner’s Guide To Understanding Convolutional Neural Networks Part 2**- Sep 8, 2016.

This is the second part of a thorough introductory treatment of convolutional neural networks. Have a look after reading the first part.

**Up to Speed on Deep Learning: July Update, Part 2**- Sep 7, 2016.

Check out this second installment of deep learning stories that made news in July. See if there are any items of note you missed.

**KDnuggets™ News 16:n32, Sep 7: Cartoon: Data Scientist was sexiest job until…; Up to Speed on Deep Learning**- Sep 7, 2016.

Cartoon: Data Scientist - the sexiest job of the 21st century until...; Up to Speed on Deep Learning: July Update; How Convolutional Neural Networks Work; Learning from Imbalanced Classes; What is the Role of the Activation Function in a Neural Network?

**A Beginner’s Guide To Understanding Convolutional Neural Networks Part 1**- Sep 6, 2016.

Interested in better understanding convolutional neural networks? Check out this first part of a very comprehensive overview of the topic.

**Top KDnuggets tweets, Aug 24-30: #DataScientist – sexiest job of the 21st century until …; Activation Function in #NeuralNetworks.**- Aug 31, 2016.

Cartoon: #DataScientist - sexiest job of the 21st century until ...; What is the Role of the Activation Function in Neural Networks?; LinkedIn Machine Learning team tutorial on building #Recommender system; Create a #Chatbot for #Telegram in #Python to Summarize Text.

**How Convolutional Neural Networks Work**- Aug 31, 2016.

Get an overview of what is going on inside convolutional neural networks, and what it is that makes them so effective.

**What is the Role of the Activation Function in a Neural Network?**- Aug 30, 2016.

Confused as to exactly what the activation function in a neural network does? Read this overview, and check out the handy cheat sheet at the end.

**KDnuggets™ News 16:n30, Aug 17: Why Deep Learning Works; Neural Networks with R; Central Limit Theorem for Data Science**- Aug 17, 2016.

3 Thoughts on Why Deep Learning Works So Well; A Beginner's Guide to Neural Networks with R!; Central Limit Theorem for Data Science; Cartoon: Make Data Great Again

**Making Data Science Accessible – Neural Networks**- Aug 11, 2016.

This post attempts to make the underlying concepts of neural networks more accessible to everyone. Gain a high-level view of how they work here.

**A Beginner’s Guide to Neural Networks with R!**- Aug 11, 2016.

In this article we will learn how neural networks work and how to implement them with the R programming language! We will see how we can easily create neural networks with R and even visualize them. A basic understanding of R is necessary to follow this article.

**3 Thoughts on Why Deep Learning Works So Well**- Aug 10, 2016.

While answering a posed question in his recent Quora Session, Yann LeCun also shared 3 high-level thoughts on why deep learning works so well.

**7 Steps to Understanding Computer Vision**- Aug 9, 2016.

A starting point for computer vision and how to go deeper. Dive into this post for an overview of the right resources and a little bit of advice.

**Top KDnuggets tweets, Jul 27 – Aug 2: Understanding neural networks with Google TensorFlow Playground; Getting Started with Data Science in Python**- Aug 3, 2016.

Understanding neural networks with Google TensorFlow Playground; The 100 Best-Funded #Analytics #DataScience #Startups; Great tutorial: Getting Started with #DataScience - #Python; #MachineLearning over 1M hotel reviews: interesting insights.

**Why Do Deep Learning Networks Scale?**- Jul 25, 2016.

A discussion of what allows deep learning architectures to scale, addressing some assumptions that often inhibit an understanding of this topic.

**Multi-Task Learning in Tensorflow: Part 1**- Jul 20, 2016.

A discussion and step-by-step tutorial on how to use Tensorflow graphs for multi-task learning.

**In Deep Learning, Architecture Engineering is the New Feature Engineering**- Jul 19, 2016.

A discussion of architecture engineering in deep neural networks, and its relationship with feature engineering.

**How to Start Learning Deep Learning**- Jul 14, 2016.

Want to get started learning deep learning? Sure you do! Check out this great overview, advice, and list of resources.

**5 Deep Learning Projects You Can No Longer Overlook**- Jul 12, 2016.

There are a number of "mainstream" deep learning projects out there, but many more niche projects flying under the radar. Have a look at 5 such projects worth checking out.

**Deep Residual Networks for Image Classification with Python + NumPy**- Jul 7, 2016.

This post outlines the results of an innovative Deep Residual Network implementation for Image Classification using Python and NumPy.

**Three Impactful Machine Learning Topics at ICML 2016**- Jul 1, 2016.

This post discusses 3 particular tutorial sessions of impact from the recent ICML 2016 conference held in New York. Check out some innovative ideas on Deep Residual Networks, Memory Networks for Language Understanding, and Non-Convex Optimization.

**Recursive (not Recurrent!) Neural Networks in TensorFlow**- Jun 30, 2016.

Learn how to implement *recursive* neural networks in TensorFlow, which can be used to learn tree-like structures, or directed acyclic graphs.

**Peeking Inside Convolutional Neural Networks**- Jun 29, 2016.

This post discusses using some tricks to peek inside of the neural network, and to visualize what the individual units in a layer detect.

**Top Machine Learning Libraries for Javascript**- Jun 24, 2016.

Javascript may not be the conventional choice for machine learning, but there is no reason it cannot be used for such tasks. Here are the top libraries to facilitate machine learning in Javascript.

**KDnuggets™ News 16:n22, Jun 22: Data Science Blog Contest; Free Machine Learning Ebook; Master SQL for Data Science**- Jun 22, 2016.

Data Science Blog Contest; New Free Andrew Ng Machine Learning Book Under Construction; 7 Steps to Mastering SQL for Data Science; A Visual Explanation of the Back Propagation Algorithm; Mining Twitter Data with Python Part 1: Collecting Data

**A Review of Popular Deep Learning Models**- Jun 21, 2016.

This post is a concise overview of a few of the more interesting popular deep learning models to have appeared over the past year. Get up to speed and try a few of the models out for yourself.

**A Visual Explanation of the Back Propagation Algorithm for Neural Networks**- Jun 17, 2016.

A concise explanation of backpropagation for neural networks is presented in elementary terms, along with explanatory visualization.

**Figuring Out the Algorithms of Intelligence**- Jun 15, 2016.

Marvin Minsky, the father of AI, passed away this year. One of his inventions was the confocal microscope, which we used to take this high-resolution picture of a live brain circuit. Something in these cells allows them to automatically identify useful connections and establish useful networks out of information.

**The Truth About Deep Learning**- Jun 6, 2016.

An honest look at deep learning, what it is **not**, its advantages over "shallow" neural networks, and some of the common assumptions and conflations that surround it.

**Troubleshooting Neural Networks: What is Wrong When My Error Increases?**- May 13, 2016.

An overview of some of the things that could lead to an increased error rate in neural network implementations.

**Deep Learning and Neuromorphic Chips**- May 12, 2016.

The 3 main ingredients to creating artificial intelligence are hardware, software, and data, and while we have focused historically on improving software and data, what if, instead, the hardware was drastically changed?

**Implementing Neural Networks in Javascript**- May 12, 2016.

Javascript is one of the most prevalent and fastest growing languages in existence today. Get a quick introduction to implementing neural networks in the language, and direction on where to go from here.

**Top Talks and Tutorials From PyData London**- May 11, 2016.

Get some insight into the most recent Python data science talks and presentations with this eclectic mix of videos from PyData London 2016.

**How to Quantize Neural Networks with TensorFlow**- May 4, 2016.

The simplest motivation for quantization is to shrink neural network representation by storing the min and max for each layer. Learn more about how to perform quantization for deep neural networks.

**Deep Learning in Neural Networks: An Overview**- Apr 26, 2016.

This post summarizes Schmidhuber's now-classic (and still relevant) 35-page summary of 900 deep learning papers, giving an overview of the state of deep learning as of 2014. A great introduction to a great paper!

**Holding Your Hand Like a Small Child Through a Neural Network – Part 2**- Apr 21, 2016.

The second of 2 posts expanding upon a now-classic neural network blog post and demonstration, guiding the reader through the workings of a simple neural network.

**Holding Your Hand Like a Small Child Through a Neural Network – Part 1**- Apr 20, 2016.

The first part of this 2 part series expands upon a now-classic neural network blog post and demonstration, guiding the reader through the foundational building blocks of a simple neural network.

**Training a Computer to Recognize Your Handwriting**- Mar 24, 2016.

The remarkable system of neurons is the inspiration behind a widely used machine learning technique called Artificial Neural Networks (ANN), used for image recognition. Learn how you can use this to recognize handwriting.

**The ICLR Experiment: Deep Learning Pioneers Take on Scientific Publishing**- Feb 15, 2016.

Deep learning pioneers Yann LeCun and Yoshua Bengio have undertaken a grand experiment in academic publishing. Embracing a radical level of transparency and unprecedented public participation, they've created an opportunity not only to find and vet the best papers, but also to gather data about the publication process itself.

**KDnuggets™ News 16:n03, Jan 27: Secret to winning Kaggle; Better Dataviz; Where Analytics is applied**- Jan 27, 2016.

Learning to Code Neural Networks; The secrets to winning Kaggle; 3 Simple Resolutions to Design Better DataViz; Data Scientist - best job in America.

**Learning to Code Neural Networks**- Jan 22, 2016.

Learn how to code a neural network, by taking advantage of someone else's experiences learning how to code a neural network.

**The Unreasonable Reputation of Neural Networks**- Jan 20, 2016.

A discussion of why deep neural networks are captivating imaginations everywhere, specifically their abilities to model many natural functions well and to learn surprisingly useful representations.

**Anthony Goldbloom gives you the Secret to winning Kaggle competitions**- Jan 20, 2016.

The Kaggle CEO shares insights on the best approaches to win Kaggle competitions, along with a brief explanation of how Kaggle competitions work.

**Research Leaders on Data Mining, Data Science and Big Data key advances, top trends**- Jan 18, 2016.

Research Leaders in Data Science and Big Data reflect on the most important research advances in 2015 and the key trends expected to dominate throughout 2016.

**What Is Machine Intelligence Vs. Machine Learning Vs. Deep Learning Vs. Artificial Intelligence (AI)?**- Jan 14, 2016.

A discussion of three major approaches to building smart machines - Classic AI, Simple Neural Networks, and Biological Neural Networks - and examples as to how each approach might address the same problem.

**KDnuggets™ News 15:n40, Dec 9: 50 useful Machine Learning & Data Science APIs; How Do Neural Nets Learn**- Dec 9, 2015.

50 Useful Machine Learning & Prediction APIs; How do Neural Networks Learn?; Free Data Science Curriculum; Spark + Deep Learning: Distributed Deep Neural Network Training with SparkNet.

**Spark + Deep Learning: Distributed Deep Neural Network Training with SparkNet**- Dec 4, 2015.

Training deep neural nets can take precious time and resources. By leveraging an existing distributed batch processing framework, SparkNet can train neural nets quickly and efficiently.

**How do Neural Networks Learn?**- Dec 2, 2015.

Neural networks are generating a lot of excitement, while simultaneously posing challenges to people trying to understand how they work. Visualize how neural nets work from the experience of implementing a real world project.

**Top KDnuggets tweets, Nov 23-29: One Artificial Neuron Taught to Recognize 100s of Patterns; 5 projects to learn Data Science**- Nov 30, 2015.

Also 5 projects to learn #DataScience; 5 Tribes of #MachineLearning & Master #Algorithm; DataMining photos document 100 years of #smiles.

**Amazon Top 20 Books in Neural Networks**- Nov 30, 2015.

These are the most popular neural networks books on Amazon. Perhaps there is something of interest to you here.

**Understanding Convolutional Neural Networks for NLP**- Nov 11, 2015.

Dive into the world of Convolutional Neural Networks (CNNs), learn how they work, how to apply them for NLP, and how to tune CNN hyperparameters for best performance.

**KDnuggets™ News 15:n36, Nov 4: Integrating R, Python; Neural Net in 11 lines; Top 20 AI/Machine Learning books**- Nov 4, 2015.

Integrating Python and R; A Neural Network in 11 lines; Amazon Top 20 Books in AI, Machine Learning; How Big Data is used in Recommendation Systems to change our lives; Data Science of IoT.

**Top /r/MachineLearning Posts, October: Machine learning video course, neural nets evaluate selfies**- Nov 2, 2015.

Machine learning video lectures, deep nets evaluate selfies, Google focusing on machine learning, DeepMind's huge text dataset made available, implement a recurrent neural net, and open source face recognition with Google's FaceNet.

**A Neural Network in 11 lines of Python**- Oct 30, 2015.

A bare-bones neural network implementation to describe the inner workings of back-propagation.

**KDnuggets™ News 15:n33, Oct 14: Is Deep Learning from Devil? Recurrent Neural Networks; Best Blogs**- Oct 14, 2015.

Does Deep Learning Come from the Devil?; Recurrent Neural Networks Tutorial, Introduction; 90+ Active Blogs on Analytics, Big Data, Data Mining, Data Science, Machine Learning; Best Data Science Online Courses.

**Recurrent Neural Networks Tutorial, Introduction**- Oct 7, 2015.

Recurrent Neural Networks (RNNs) are popular models that have shown great promise in NLP and many other Machine Learning tasks. Here is a much-needed guide to key RNN models and a few brilliant research papers.

**Top /r/MachineLearning Posts, September: Implement a neural network from scratch in C++**- Oct 6, 2015.

Neural network in C++ for beginners, Chinese character handwriting recognition beats humans, a handy machine learning algorithm cheat sheet, neural nets versus functional programming, and a neural nets paper repository.

**Recycling Deep Learning Models with Transfer Learning**- Aug 14, 2015.

Deep learning exploits gigantic datasets to produce powerful models. But what can we do when our datasets are comparatively small? Transfer learning by fine-tuning deep nets offers a way to leverage existing datasets to perform well on new tasks.

**Top KDnuggets tweets, Aug 04-10: Survival analysis in R – step by step guide**- Aug 11, 2015.

Survival analysis in R - step by step guide; Neural Nets, AI and Deep Learning journey to acceptance; Data is Ugly - Tales of Data Cleaning; Apache Flink and the case for #stream processing #BigData #Analytics.

**IAPA Informed Australia roadshow with Data Scientist Patrick Hall**- Aug 4, 2015.

In 3 sessions in Sydney, Melbourne and Canberra (tickets free, but seats are limited), SAS Data Scientist Patrick Hall will provide an inspiring look into his team's machine learning research and how it applies to industry.

**Lund University Develops an Artificial Neural Network for Matching Heart Transplant Donors with Recipients**- Jul 9, 2015.

Finding the correct donor for a transplant is a challenging and intensively researched use case in data science. Here, you can find out how MathWorks tools were used to address this problem.

**Top /r/MachineLearning Posts, June: Neural Network Generated Images, Free Data Science Books, Super Mario World**- Jul 2, 2015.

Generating images with neural networks, free data science books, machine learning for playing Mario, implementing neural networks in Python, and video generation based on terms were all covered this month on /r/MachineLearning.

**Why Does Deep Learning Work?**- Jun 23, 2015.

Many researchers have recently been trying to open the “black box” of deep learning. Here we summarize these efforts: how the neural nets of deep learning evolve, and how the Spin Funnel and deep learning are related.**Top 10 Machine Learning Videos on YouTube**- Jun 23, 2015.

The top machine learning videos on YouTube include lecture series from Stanford and Caltech, Google Tech Talks on deep learning, using machine learning to play Mario and Hearthstone, and detecting NHL goals from live streams.**Top /r/MachineLearning Posts, May: Unreasonable Effectiveness of Recurrent Neural Networks, Time-Lapse Mining**- Jun 1, 2015.

The Unreasonable Effectiveness of Recurrent Neural Networks, Time-lapse mining from Net photos, Deep Learning Textbook Part I, Kaggle R Tutorial, and Free Machine Learning ebooks.**Top KDnuggets tweets, May 19-25: KDnuggets Poll: R leads RapidMiner, Python catches up, Spark ignites; Choosing a Learning Algorithm in Azure ML**- May 26, 2015.

R vs #Python, why each is better; Machine Learning predicts that a fair race between Mo Farah and Usain Bolt is 492m; How Machine Learning Is Eating the #Software World; Handy Guide: Choosing a Learning Algorithm in Azure ML.**Dark Knowledge Distilled from Neural Network**- May 26, 2015.

Geoff Hinton never stops generating new ideas. This post is a review of his research on “dark knowledge”. What is that supposed to mean?**KDnuggets™ News 15:n12, Apr 22: Predictive Analytics Future? Top LinkedIn Groups; Preventing Overfitting**- Apr 22, 2015.
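The core trick behind "dark knowledge" is raising the softmax temperature so the teacher network's near-zero probabilities for wrong classes become visible as soft targets. A minimal sketch, with made-up logits for illustration:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; T=1 is the usual softmax."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

teacher_logits = np.array([6.0, 2.0, 1.0])
hard = softmax(teacher_logits, T=1.0)  # nearly one-hot
soft = softmax(teacher_logits, T=4.0)  # wrong classes get visible mass
```

A smaller student network is then trained to match the soft distribution, which carries information about which wrong answers the teacher considers plausible, not just which answer is right.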

New Poll: Future of Predictive Analytics? Top LinkedIn Groups for Analytics, Big Data, Data Mining - "Big Bang" to Now; Preventing Overfitting in Neural Networks; Cloud Machine Learning Wars: Amazon vs IBM Watson vs Microsoft Azure.**Data Science 101: Preventing Overfitting in Neural Networks**- Apr 17, 2015.

Overfitting is a major problem for Predictive Analytics and especially for Neural Networks. Here is an overview of key methods to avoid overfitting, including regularization (L2 and L1), Max norm constraints and Dropout.**KDnuggets™ News 15:n11, Apr 15: Big Data Predictive Analytics Gainers & Losers; Awesome Public Datasets**- Apr 15, 2015.
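Each of the methods the teaser names reduces to a few lines. A hedged sketch with illustrative values (not the article's own code): L2 regularization adds a weight-decay term to the gradient (L1 does the same with the sign of the weights), a max-norm constraint projects weights back into a ball, and inverted dropout zeroes random activations while rescaling the survivors:

```python
import numpy as np

rng = np.random.default_rng(0)

# L2 regularization: penalty (lam/2)*||w||^2 adds lam*w to the gradient,
# so each step shrinks weights toward zero ("weight decay").
w = rng.normal(size=5)
data_grad = np.zeros(5)              # pretend the data gradient is zero
lam, lr = 0.1, 1.0
w_decayed = w - lr * (data_grad + lam * w)

# L1 regularization instead penalizes |w|, pushing weights to exactly zero.
w_l1 = w - lr * (data_grad + lam * np.sign(w))

# Max-norm constraint: after an update, project w onto a ball of radius c.
c = 2.0
w_clipped = w_decayed * min(1.0, c / np.linalg.norm(w_decayed))

# Inverted dropout: at training time, zero each unit with probability 1-p
# and divide by p so the expected activation is unchanged; at test time
# the layer is left alone.
p = 0.8
a = np.ones(1000)
mask = rng.random(1000) < p
a_dropped = a * mask / p
```

All three fight overfitting by limiting how much any single weight or unit can dominate, which is why they are routinely combined in practice.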

Awesome Public Datasets on GitHub; Gold Mine or Blind Alley? Functional Programming for Machine Learning; Inside Deep Learning - Convolutional networks; KDnuggets Free Pass to Strata Hadoop World London.**Top /r/MachineLearning Posts, Mar 22-28: Deep Learning flaws & Security, DeepMind Publications, and Keras**- Mar 30, 2015.

Computer Vision security issues, DeepMind, statistics with Python, hacking on neural networks, and Keras, a neural network library, are all topics on top of /r/MachineLearning this week.**Talking Machine – 3 Deep Learning Gurus Talk about History and Future of Machine Learning, part 1**- Mar 25, 2015.

A recent interview from the Talking Machines podcast with three deep learning experts. They discussed the neural network winter and its renewal.**Top KDnuggets tweets, Mar 16-18: 87 Studies shown that accurate numbers aren’t more useful than the ones you make up (Dilbert)**- Mar 19, 2015.

Also Sirius - a free, open-source version of Siri; #PI art: the first 13,689 digits of pi; Great tutorial + #Python code: 1-Layer Neural Networks.**Top /r/MachineLearning Posts, Feb 8-14: Automating Tinder, Statistics and Machine Learning**- Feb 17, 2015.

Automating Tinder with Eigenfaces, statistics lessons in big data analysis, an upcoming AMA, the basics of PCA, and neural network programming in Python are all topics covered in the last week on Reddit.**Facebook Open Sources deep-learning modules for Torch**- Feb 9, 2015.

We review Facebook's recently released Torch modules for Deep Learning, which help researchers train large-scale convolutional neural networks for image recognition, natural language processing and other AI applications.**Top /r/MachineLearning posts, Jan 25-31**- Feb 6, 2015.

Downsides to jobs in machine learning fields, AI learning materials, novel topic modelling techniques and weekly simple question threads are all topics of discussion this week on Reddit /r/MachineLearning.