- Feature Reduction using Genetic Algorithm with Python - Mar 25, 2019.
This tutorial discusses how to use the genetic algorithm (GA) for reducing the feature vector extracted from the Fruits360 dataset in Python mainly using NumPy and Sklearn.
- Checklist for Debugging Neural Networks - Mar 22, 2019.
Check out these tangible steps you can take to identify and fix issues with training, generalization, and optimization for machine learning models.
- Artificial Neural Networks Optimization using Genetic Algorithm with Python - Mar 18, 2019.
This tutorial explains the usage of the genetic algorithm for optimizing the network weights of an Artificial Neural Network for improved performance.
- Advanced Keras — Accurately Resuming a Training Process - Mar 14, 2019.
This article on practical advanced Keras use covers handling nontrivial cases where custom callbacks are used.
- AI: Arms Race 2.0 - Mar 12, 2019.
An analysis of the current state of the competition between the US, Europe, and China in AI, examining research, patent publications, the global datasphere, devices and IoT, people, and more.
- Breaking neural networks with adversarial attacks - Mar 7, 2019.
We develop an intuition behind "adversarial attacks" on deep neural networks, and understand why these attacks are so successful.
- Neural Networks with Numpy for Absolute Beginners — Part 2: Linear Regression - Mar 7, 2019.
In this tutorial, you will learn in detail how to implement linear regression for prediction using NumPy, and visualize how the algorithm learns epoch by epoch. In addition, you will explore two-layer neural networks.
- Neural Networks seem to follow a puzzlingly simple strategy to classify images - Mar 5, 2019.
We explain why state-of-the-art Deep Neural Networks can still recognize scrambled images perfectly well and how this helps to uncover a puzzlingly simple strategy that DNNs seem to use to classify natural images.
- Neural Networks with Numpy for Absolute Beginners: Introduction - Mar 5, 2019.
In this tutorial, you will get a brief understanding of what Neural Networks are and how they have been developed. In the end, you will gain a brief intuition as to how the network learns.
- Comparing MobileNet Models in TensorFlow - Mar 1, 2019.
MobileNets are a family of mobile-first computer vision models for TensorFlow, designed to effectively maximize accuracy while being mindful of the restricted resources for an on-device or embedded application.
- TensorFlow.js: Machine learning for the web and beyond - Feb 28, 2019.
- How to do Everything in Computer Vision - Feb 27, 2019.
The many standard tasks in computer vision all require special consideration: classification, detection, segmentation, pose estimation, enhancement and restoration, and action recognition. Let me show you how to do everything in Computer Vision with Deep Learning!
- Artificial Neural Network Implementation using NumPy and Image Classification - Feb 21, 2019.
This tutorial builds an artificial neural network in Python using NumPy from scratch in order to build an image classification application for the Fruits360 dataset.
- Deep Multi-Task Learning – 3 Lessons Learned - Feb 15, 2019.
We share specific points to consider when implementing multi-task learning in a Neural Network (NN) and present TensorFlow solutions to these issues.
- A comprehensive survey on graph neural networks - Feb 15, 2019.
This article summarizes a paper which presents us with a broad sweep of the graph neural network landscape. It’s a survey paper, so you’ll find details on the key approaches and representative papers, as well as information on commonly used datasets and benchmark performance on them.
- Neural Networks – an Intuition - Feb 7, 2019.
Neural networks are one of the most powerful algorithms used in the field of machine learning and artificial intelligence. We attempt to outline their similarities with the human brain and how intuition plays a big part in this.
- NLP Overview: Modern Deep Learning Techniques Applied to Natural Language Processing - Jan 8, 2019.
Trying to keep up with advancements at the overlap of neural networks and natural language processing can be troublesome. That's where today's spotlighted resource comes in.
- The Backpropagation Algorithm Demystified - Jan 2, 2019.
A crucial aspect of machine learning is a model's ability to recognize its error margins and to interpret data more precisely as more data is fed through its neural network. This process, commonly referred to as backpropagation, isn't as complex as you might think.
- Supervised Learning: Model Popularity from Past to Present - Dec 28, 2018.
An extensive look at the history of machine learning models, using historical data from the number of publications of each type to attempt to answer the question: what is the most popular model?
- BERT: State of the Art NLP Model, Explained - Dec 26, 2018.
BERT’s key technical innovation is applying the bidirectional training of Transformer, a popular attention model, to language modelling. It has caused a stir in the Machine Learning community by presenting state-of-the-art results in a wide variety of NLP tasks.
- The brain as a neural network: this is why we can’t get along - Dec 19, 2018.
This article sets out to answer the question: what insights can we gain about ourselves by thinking of the brain as a machine learning model?
- Deep Learning Cheat Sheets - Nov 28, 2018.
Check out this collection of high-quality deep learning cheat sheets, filled with valuable, concise information on a variety of neural network-related topics.
- Using a Keras Long Short-Term Memory (LSTM) Model to Predict Stock Prices - Nov 21, 2018.
LSTMs are very powerful in sequence prediction problems because they’re able to store past information. This is important in our case because the previous price of a stock is crucial in predicting its future price.
- Introduction to PyTorch for Deep Learning - Nov 7, 2018.
In this tutorial, you’ll get an introduction to deep learning using the PyTorch framework, and by its conclusion, you’ll be comfortable applying it to your deep learning models.
- Mastering the Learning Rate to Speed Up Deep Learning - Nov 6, 2018.
Figuring out the optimal set of hyperparameters can be one of the most time consuming portions of creating a machine learning model, and that’s particularly true in deep learning.
- Introduction to Deep Learning with Keras - Oct 29, 2018.
In this article, we’ll build a simple neural network using Keras. Now let’s proceed to solve a real business problem: an insurance company wants you to develop a model to help them predict which claims look fraudulent.
- Generative Adversarial Networks – Paper Reading Road Map - Oct 24, 2018.
To help others who want to learn more about the technical side of GANs, I wanted to share some papers I have read, in the order that I read them.
- The Main Approaches to Natural Language Processing Tasks - Oct 17, 2018.
Let's have a look at the main approaches to NLP tasks that we have at our disposal. We will then have a look at the concrete NLP tasks we can tackle with said approaches.
- [Webinar] Neural Network Fundamentals - Oct 16, 2018.
In this webinar (Oct 25, 2018, 10:00 am PST), we will apply convolutional neural networks to the ImageNet scenario. We will also review some of the ImageNet architectures and how convolutions work.
- Sequence Modeling with Neural Networks – Part I - Oct 3, 2018.
In the context of this post, we will focus on modeling sequences as a well-known data structure and will study its specific learning framework.
- How to Create a Simple Neural Network in Python - Oct 2, 2018.
The best way to understand how neural networks work is to create one yourself. This article will demonstrate how to do just that.
- More Effective Transfer Learning for NLP - Oct 1, 2018.
Until recently, the natural language processing community was lacking its ImageNet equivalent — a standardized dataset and training objective to use for training base models.
- Introduction to Deep Learning - Sep 28, 2018.
I decided to begin to put some structure in my understanding of Neural Networks through this series of articles.
- Power Laws in Deep Learning 2: Universality - Sep 26, 2018.
It is amazing that Deep Neural Networks display this Universality in their weight matrices, and this suggests some deeper reason for Why Deep Learning Works.
- 6 Steps To Write Any Machine Learning Algorithm From Scratch: Perceptron Case Study - Sep 20, 2018.
Writing a machine learning algorithm from scratch is an extremely rewarding learning experience. We highlight 6 steps in this process.
- Power Laws in Deep Learning - Sep 20, 2018.
In pretrained, production-quality DNNs, the weight matrices of the fully connected (FC) layers display fat-tailed power law behavior.
- Data Augmentation For Bounding Boxes: Rethinking image transforms for object detection - Sep 19, 2018.
Data augmentation is one way to battle a shortage of training data, by artificially augmenting your dataset. In fact, the technique has proven so successful that it's become a staple of deep learning systems.
- Everything You Need to Know About AutoML and Neural Architecture Search - Sep 13, 2018.
So how does it work? How do you use it? What options do you have to harness that power today? Here’s everything you need to know about AutoML and NAS.
- Machine Learning Cheat Sheets - Sep 11, 2018.
Check out this collection of machine learning concept cheat sheets based on Stanford CS 229 material, including supervised and unsupervised learning, neural networks, tips & tricks, probability & stats, and algebra & calculus.
- Training with Keras-MXNet on Amazon SageMaker - Sep 10, 2018.
In this post, you will learn how to train Keras-MXNet jobs on Amazon SageMaker. I’ll show you how to build custom Docker containers for CPU and GPU training, configure multi-GPU training, pass parameters to a Keras script, and save the trained models in Keras and MXNet formats.
- Neural Networks and Deep Learning: A Textbook - Sep 7, 2018.
This book covers both classical and modern models in deep learning. The book is intended to be a textbook for universities, and it covers the theoretical and algorithmic aspects of deep learning.
- AI Knowledge Map: How To Classify AI Technologies - Aug 31, 2018.
What follows is an effort to draw an architecture for accessing knowledge on AI and following emergent dynamics: a gateway to pre-existing knowledge on the topic that will allow you to scout around for additional information and eventually create new knowledge on AI.
- Auto-Keras, or How You can Create a Deep Learning Model in 4 Lines of Code - Aug 17, 2018.
Auto-Keras is an open source software library for automated machine learning. Auto-Keras provides functions to automatically search for architecture and hyperparameters of deep learning models.
- AutoKeras: The Killer of Google’s AutoML - Aug 15, 2018.
Auto-Keras is an open source "competitor" to Google’s AutoML, a new cloud software suite of Machine Learning tools. It’s based on Google’s state-of-the-art research in Neural Architecture Search (NAS).
- Only Numpy: Implementing GANs and Adam Optimizer using Numpy - Aug 6, 2018.
This post is an implementation of GANs and the Adam optimizer using only Python and Numpy, with minimal focus on the underlying maths involved.
- Best (and Free!!) Resources to Understand Nuts and Bolts of Deep Learning - Jul 19, 2018.
This blog, however, is not addressed to the absolute beginner. Once you have a bit of intuition about how deep learning algorithms work, you might want to understand how things work under the hood.
- Beginners Ask “How Many Hidden Layers/Neurons to Use in Artificial Neural Networks?” - Jul 16, 2018.
By the end of this article, you will at least have an idea of how these questions are answered, and be able to test yourself with simple examples.
- AI Solutionism - Jul 12, 2018.
Machine learning has huge potential for the future of humanity — but it won’t solve all our problems.
- Deep Learning Tips and Tricks - Jul 4, 2018.
This post is a distilled collection of conversations, messages, and debates on how to optimize deep models. If you have tricks you’ve found impactful, please share them in the comments below!
- Deep Quantile Regression - Jul 3, 2018.
Most Deep Learning frameworks currently focus on giving a best estimate as defined by a loss function. Occasionally something beyond a point estimate is required to make a decision. This is where a distribution would be useful. This article will purely focus on inferring quantiles.
- Inside the Mind of a Neural Network with Interactive Code in Tensorflow - Jun 29, 2018.
Understand the inner workings of neural network models as this post covers three related topics: histogram of weights, visualizing the activation of neurons, and interior / integral gradients.
- Building a Basic Keras Neural Network Sequential Model - Jun 29, 2018.
The approach basically coincides with Chollet's Keras 4 step workflow, which he outlines in his book "Deep Learning with Python," using the MNIST dataset, and the model built is a Sequential network of Dense layers. A building block for additional posts.
- Using Topological Data Analysis to Understand the Behavior of Convolutional Neural Networks - Jun 28, 2018.
Neural Networks are powerful but complex and opaque tools. Using Topological Data Analysis, we can describe the functioning and learning of a convolutional neural network in a compact and understandable way.
- KDnuggets™ News 18:n25, Jun 27: 5 Clustering Algorithms Data Scientists Need to Know; Detecting Sarcasm with Deep Convolutional Neural Networks? - Jun 27, 2018.
Also: 30 Free Resources for Machine Learning, Deep Learning, NLP; 7 Simple Data Visualizations You Should Know in R.
- Batch Normalization in Neural Networks - Jun 26, 2018.
This article explains batch normalization in a simple way. I wrote this article after what I learned from Fast.ai and deeplearning.ai.
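As a rough illustration of the idea that article covers: batch normalization standardizes each feature over a mini-batch, then applies a learned scale and shift. A minimal NumPy sketch (variable names and toy sizes are my own, not the article's code):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale and shift.
    x: (batch, features); gamma, beta: learned (features,) parameters."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta

# A mini-batch with a large mean and variance, which batch norm removes.
x = np.random.randn(32, 4) * 10 + 5
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With `gamma = 1` and `beta = 0` the output is simply the standardized batch; during training, `gamma` and `beta` are learned so the network can undo the normalization where that helps.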
- Deep Learning Best Practices – Weight Initialization - Jun 21, 2018.
In this blog I am going to talk about the issues related to the initialization of weight matrices and ways to mitigate them. Before that, let's cover some basics and notation that we will use going forward.
- Taming LSTMs: Variable-sized mini-batches and why PyTorch is good for your health - Jun 14, 2018.
After reading this, you’ll be back to fantasies of you + PyTorch eloping into the sunset while your Recurrent Networks achieve new accuracies you’ve only read about on Arxiv.
- How To Create Natural Language Semantic Search For Arbitrary Objects With Deep Learning - Jun 13, 2018.
An end-to-end example of how to build a system that can search objects semantically.
- DIY Deep Learning Projects - Jun 8, 2018.
Inspired by the great work of Akshay Bahadur in this article you will see some projects applying Computer Vision and Deep Learning, with implementations and details so you can reproduce them on your computer.
- The Keras 4 Step Workflow - Jun 4, 2018.
In his book "Deep Learning with Python," Francois Chollet outlines a process for developing neural networks with Keras in 4 steps. Let's take a look at this process with a simple example.
- On the contribution of neural networks and word embeddings in Natural Language Processing - May 31, 2018.
In this post I will try to explain, in a very simplified way, how to apply neural networks and integrate word embeddings in text-based applications, and some of the main implicit benefits of using neural networks and word embeddings in NLP.
- Improving the Performance of a Neural Network - May 30, 2018.
There are many techniques available that could help us achieve that. Follow along to get to know them and to build your own accurate neural network.
- Generative Adversarial Neural Networks: Infinite Monkeys and The Great British Bake Off - May 22, 2018.
Adversarial Neural Networks are oddly named since they actually cooperate to make things.
- Top Stories, May 14-20: Data Science vs Machine Learning vs Data Analytics vs Business Analytics; Implement a YOLO Object Detector from Scratch in PyTorch - May 21, 2018.
Also: An Introduction to Deep Learning for Tabular Data; 9 Must-have skills you need to become a Data Scientist, updated; GANs in TensorFlow from the Command Line: Creating Your First GitHub Project; Complete Guide to Build ConvNet HTTP-Based Application
- An Introduction to Deep Learning for Tabular Data - May 17, 2018.
This post will discuss a technique that many people don’t even realize is possible: the use of deep learning for tabular data, and in particular, the creation of embeddings for categorical variables.
- How to Implement a YOLO (v3) Object Detector from Scratch in PyTorch: Part 1 - May 17, 2018.
The best way to go about learning object detection is to implement the algorithms by yourself, from scratch. This is exactly what we'll do in this tutorial.
- GANs in TensorFlow from the Command Line: Creating Your First GitHub Project - May 16, 2018.
In this article I will present the steps to create your first GitHub Project. I will use as an example Generative Adversarial Networks.
- KDnuggets™ News 18:n20, May 16: PyTorch Tensor Basics; Data Science in Finance; Executive Guide to Data Science - May 16, 2018.
PyTorch Tensor Basics; Top 7 Data Science Use Cases in Finance; The Executive Guide to Data Science and Machine Learning; Data Augmentation: How to use Deep Learning when you have Limited Data
- Complete Guide to Build ConvNet HTTP-Based Application using TensorFlow and Flask RESTful Python API - May 15, 2018.
In this tutorial, a CNN is built, trained, and tested against the CIFAR-10 dataset. To make the model remotely accessible, a Flask web application is created using Python to receive an uploaded image and return its classification label over HTTP.
- Detecting Breast Cancer with Deep Learning - May 9, 2018.
Breast cancer is the most common invasive cancer in women, and the second leading cause of cancer death in women after lung cancer. In this article I will build a WideResNet-based neural network to categorize slide images into two classes, one that contains breast cancer and one that doesn't, using Deep Learning Studio.
- How I Used CNNs and Tensorflow and Lost a Silver Medal in Kaggle Challenge - May 8, 2018.
I joined the competition a month before it ended, eager to explore how to use deep natural language processing (NLP) techniques for this problem. Then came the disappointment. I will tell you how I lost my silver medal in that competition.
- Building Convolutional Neural Network using NumPy from Scratch - Apr 26, 2018.
In this article, CNN is created using only NumPy library. Just three layers are created which are convolution (conv for short), ReLU, and max pooling.
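For a sense of what those three layers compute, here is a hedged single-channel sketch in NumPy (my own toy code, not the article's implementation):

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D convolution (really cross-correlation, as in most DL code)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Element-wise rectifier: negatives become zero."""
    return np.maximum(0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.random.rand(8, 8)                        # toy single-channel image
feat = max_pool(relu(conv2d(img, np.ones((3, 3)))))  # 8x8 -> 6x6 -> 3x3
```

Stacking these three operations is the whole forward pass of the article's minimal CNN; a real implementation adds multiple channels and filters per layer.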
- Why Deep Learning is perfect for NLP (Natural Language Processing) - Apr 20, 2018.
Deep learning brings multiple benefits in learning multiple levels of representation of natural language. Here we will cover the motivation of using deep learning and distributed representation for NLP, word embeddings and several methods to perform word embeddings, and applications.
- Neural Network based Startup Name Generator - Apr 20, 2018.
How to build a recurrent neural network to generate suggestions for your new company’s name.
- Derivation of Convolutional Neural Network from Fully Connected Network Step-By-Step - Apr 19, 2018.
What are the advantages of ConvNets over FC networks in image analysis? How is a ConvNet derived from FC networks? Where did the term convolution in CNNs come from? This article answers these questions.
- Are High Level APIs Dumbing Down Machine Learning? - Apr 16, 2018.
Libraries like Keras simplify the construction of neural networks, but are they impeding practitioners' full understanding? Or are they simply useful (and inevitable) abstractions?
- Ten Machine Learning Algorithms You Should Know to Become a Data Scientist - Apr 11, 2018.
It's important for data scientists to have a broad range of knowledge, keeping themselves updated with the latest trends. With that being said, we take a look at the top 10 machine learning algorithms every data scientist should know.
- Getting Started with PyTorch Part 1: Understanding How Automatic Differentiation Works - Apr 11, 2018.
PyTorch has emerged as a major contender in the race to be the king of deep learning frameworks. What makes it really alluring is its dynamic computation graph paradigm.
- Top 8 Free Must-Read Books on Deep Learning - Apr 10, 2018.
Deep Learning is the newest trend coming out of Machine Learning, but what exactly is it? And how do I learn more? With that in mind, here's a list of 8 free books on deep learning.
- Top 20 Deep Learning Papers, 2018 Edition - Apr 3, 2018.
Deep Learning is constantly evolving at a fast pace. New techniques, tools and implementations are changing the field of Machine Learning and bringing excellent results.
- Implementing Deep Learning Methods and Feature Engineering for Text Data: The Continuous Bag of Words (CBOW) - Apr 3, 2018.
The CBOW model architecture tries to predict the current target word (the center word) based on the source context words (surrounding words).
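In code terms, CBOW averages the embeddings of the context words and scores every vocabulary word against that average. A minimal NumPy forward pass (toy sizes and variable names are my own assumptions, not the article's code):

```python
import numpy as np

vocab_size, embed_dim = 10, 4
rng = np.random.default_rng(0)
W_in = rng.normal(size=(vocab_size, embed_dim))   # input (context) embeddings
W_out = rng.normal(size=(embed_dim, vocab_size))  # output (target) weights

def cbow_forward(context_ids):
    """Predict a distribution over the vocabulary for the center word."""
    h = W_in[context_ids].mean(axis=0)            # average the context embeddings
    scores = h @ W_out                            # score every vocabulary word
    exp = np.exp(scores - scores.max())           # numerically stable softmax
    return exp / exp.sum()

probs = cbow_forward([1, 3, 5, 7])                # four surrounding words
```

Training would then push `probs` toward a one-hot vector for the true center word, typically with cross-entropy loss; the learned `W_in` rows are the word embeddings.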
- A “Weird” Introduction to Deep Learning - Mar 30, 2018.
There are amazing introductions, courses and blog posts on Deep Learning. But this is a different kind of introduction.
- Is ReLU After Sigmoid Bad? - Mar 23, 2018.
Recently [we] were analyzing how different activation functions interact among themselves, and we found that using relu after sigmoid in the last two layers worsens the performance of the model.
- Deep Misconceptions About Deep Learning - Mar 5, 2018.
I hope to clarify some approaches to attacking DL problems and also discuss why it performs so well in some areas, such as natural language processing (NLP), image recognition, and machine translation, while failing at others.
- Age of AI Conference 2018 – Day 2 Highlights - Feb 23, 2018.
Here are some of the highlights from the second day of the Age of AI Conference, February 1, at the Regency Ballroom in San Francisco.
- 5 Fantastic Practical Natural Language Processing Resources - Feb 22, 2018.
This post presents 5 practical resources for getting a start in natural language processing, covering a wide array of topics and approaches.
- Age of AI Conference 2018 – Day 1 Highlights - Feb 21, 2018.
Here are some of the highlights from the first day of the Age of AI Conference, January 31, at the Regency Ballroom in San Francisco.
- KDnuggets™ News 18:n08, Feb 21: Neural network AI is simple – stop pretending you are a genius; Data Science at the command line - Feb 21, 2018.
Want a Job in Data? Learn This; 5 Things You Need To Know About Data Science; Cartoon: Machine Learning Problems in 2118
- Resurgence of AI During 1983-2010 - Feb 16, 2018.
We discuss supervised learning, unsupervised learning and reinforcement learning, neural networks, and 6 reasons that helped AI Research and Development to move ahead.
- Neural network AI is simple. So… Stop pretending you are a genius - Feb 15, 2018.
This post may come off as a rant, but that’s not so much its intent, as it is to point out why we went from having very few AI experts, to having so many in so little time.
- The Birth of AI and The First AI Hype Cycle - Feb 13, 2018.
A dazzling review of AI History, from Alan Turing and Turing Test, to Simon and Newell and Logic Theorist, to Marvin Minsky and Perceptron, birth of Rule-based systems and Machine Learning, Eliza - first chatbot, Robotics, and the bust which led to first AI Winter.
- KDnuggets™ News 18:n06, Feb 7: 5 Fantastic Practical Machine Learning Resources; 8 Must-Know Neural Network Architectures - Feb 7, 2018.
5 Fantastic Practical Machine Learning Resources; The 8 Neural Network Architectures Machine Learning Researchers Need to Learn; Generalists Dominate Data Science; Avoid Overfitting with Regularization; Understanding Learning Rates and How It Improves Performance in Deep Learning
- A Simple Starter Guide to Build a Neural Network - Feb 5, 2018.
This guide serves as a basic hands-on work to lead you through building a neural network from scratch. Most of the mathematical concepts and scientific decisions are left out.
- Understanding Learning Rates and How It Improves Performance in Deep Learning - Feb 1, 2018.
Furthermore, the learning rate affects how quickly our model can converge to a local minimum (i.e., arrive at the best accuracy). Getting it right from the get-go means less time spent training the model.
- The 8 Neural Network Architectures Machine Learning Researchers Need to Learn - Jan 31, 2018.
In this blog post, I want to share the 8 neural network architectures from the course that I believe any machine learning researchers should be familiar with to advance their work.
- My Journey into Deep Learning - Jan 30, 2018.
In this post I’ll share how I’ve been studying Deep Learning and using it to solve data science problems. It’s an informal post but with interesting content (I hope).
- Using Genetic Algorithm for Optimizing Recurrent Neural Networks - Jan 22, 2018.
In this tutorial, we will see how to apply a Genetic Algorithm (GA) for finding an optimal window size and a number of units in Long Short-Term Memory (LSTM) based Recurrent Neural Network (RNN).
- Is Learning Rate Useful in Artificial Neural Networks? - Jan 15, 2018.
This article will help you understand why we need the learning rate and whether it is useful for training an artificial neural network. Using a very simple Python example of a single-layer perceptron, we will vary the learning rate value to illustrate the idea.
- Computer Vision by Andrew Ng - 11 Lessons Learned - Dec 22, 2017.
I recently completed Andrew Ng’s computer vision course on Coursera. In this article, I will discuss 11 key lessons that I learned in the course.
- DeepSchool.io: Deep Learning Learning - Dec 22, 2017.
What I truly envision for deep school is that this will build a whole lot of Meetup nodes across the world where people will learn, mentor and network around sharing AI knowledge.
- Deep Learning Made Easy with Deep Cognition - Dec 21, 2017.
Normally we do deep learning by programming and learning new APIs, some harder than others, and some really easy and expressive, like Keras. But how about a visual API to create and deploy deep learning solutions with the click of a button? This is the promise of Deep Cognition.
- The 10 Deep Learning Methods AI Practitioners Need to Apply - Dec 13, 2017.
Deep learning emerged from the past decade's explosive computational growth as a serious contender in the field, winning many important machine learning competitions. Interest has not cooled as of 2017; today, we see deep learning mentioned in every corner of machine learning.
- KDnuggets™ News 17:n47, Dec 13: Top Data Science, Machine Learning Methods in 2017; Main Data Science Developments in 2017, Key Trends; Lunch Break with Keras - Dec 13, 2017.
Also: Managing Machine Learning Workflows with Scikit-learn Pipelines; Best Masters in Data Science and Analytics - Europe Edition; Another Day in the Life of a Data Scientist; TensorFlow for Short-Term Stocks; Creating Simple Data Visualizations as an Act of Kindness
- Today I Built a Neural Network During My Lunch Break with Keras - Dec 8, 2017.
So yesterday someone told me you can build a (deep) neural network in 15 minutes in Keras. Of course, I didn’t believe that at all. So the next day I set out to play with Keras on my own data.
- Some Musings on Capsule Networks and DLPaper2Code - Dec 6, 2017.
The Godfather of Deep Learning did it again and came up with something brilliant: adding layers inside existing layers instead of simply adding more layers, i.e. nested layers, giving rise to Capsule Networks!
- What is a Bayesian Neural Network? - Dec 5, 2017.
BNNs are important in specific settings, especially when we care about uncertainty very much.
- Using Deep Learning to Solve Real World Problems - Dec 4, 2017.
Do you assume that deep learning is only being used for toy problems and in self-learning scenarios? This post includes several firsthand accounts of organizations using deep neural networks to solve real world problems.
- Exploring Recurrent Neural Networks - Dec 1, 2017.
We explore recurrent neural networks, starting with the basics, using a motivating weather modeling problem, and implement and train an RNN in TensorFlow.
- InfoGAN - Generative Adversarial Networks Part III - Nov 30, 2017.
In this third part of the series, the contributions of InfoGAN are explored: applying concepts from information theory to transform some of the noise terms into latent codes that have systematic, predictable effects on the outcome.
- How To Unit Test Machine Learning Code - Nov 28, 2017.
One of the main principles I learned during my time at Google Brain was that unit tests can make or break your algorithm and can save you weeks of debugging and training time.
- Deep Learning Specialization by Andrew Ng – 21 Lessons Learned - Nov 24, 2017.
I found all 3 courses extremely useful and learned an incredible amount of practical knowledge from the instructor, Andrew Ng. Ng does an excellent job of filtering out the buzzwords and explaining the concepts in a clear and concise manner.
- Understanding Objective Functions in Neural Networks - Nov 23, 2017.
This blog post is targeted towards people who have experience with machine learning, and want to get a better intuition on the different objective functions used to train neural networks.
- Key Takeaways from Open Data Science Conference (ODSC) West 2017 - Nov 21, 2017.
This year, ODSC West was held at the Hyatt Regency San Francisco Airport, from November 2 to 4. I am attempting here to give you a snapshot tour of what I experienced.
- Estimating an Optimal Learning Rate For a Deep Neural Network - Nov 21, 2017.
This post describes a simple and powerful way to find a reasonable learning rate for your neural network.
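The technique that post describes is in the spirit of Leslie Smith's learning-rate range test: sweep the learning rate upward over one short run, record the loss at each step, and pick a rate from the region where the loss falls fastest. A minimal sketch on a toy quadratic loss (function names and the toy problem are my own assumptions):

```python
import numpy as np

def lr_range_test(grad_fn, loss_fn, w0, lr_min=1e-5, lr_max=1.0, steps=100):
    """Sweep the learning rate exponentially from lr_min to lr_max,
    taking one SGD step per setting and logging (lr, loss) pairs."""
    lrs = np.geomspace(lr_min, lr_max, steps)
    w, log = w0, []
    for lr in lrs:
        w = w - lr * grad_fn(w)      # one gradient step at this learning rate
        log.append((lr, loss_fn(w)))
    return log

# Toy problem: loss = w^2, gradient = 2w, starting far from the minimum.
log = lr_range_test(lambda w: 2 * w, lambda w: w * w, w0=5.0)
best_lr = min(log, key=lambda t: t[1])[0]  # in practice, read this off a plot
```

On a real network you would run this over mini-batches and plot loss against learning rate on a log axis, choosing a rate slightly below the point where the loss curve bottoms out.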
- NVIDIA DGX Systems – Deep Learning Software Whitepaper - Nov 20, 2017.
Download this whitepaper from NVIDIA DGX Systems to gain insight into the engineering expertise and innovation behind the pre-optimized deep learning frameworks available only on NVIDIA DGX Systems, and learn how to dramatically reduce your engineering costs using today's most popular frameworks.
- Generative Adversarial Networks — Part II - Nov 17, 2017.
Second part of this incredible overview of Generative Adversarial Networks, explaining the contributions of Deep Convolutional-GAN (DCGAN) paper.
- Capsule Networks Are Shaking up AI – Here’s How to Use Them - Nov 16, 2017.
If you follow AI you might have heard about the advent of the potentially revolutionary Capsule Networks. I will show you how you can start using them today.
- Overview of GANs (Generative Adversarial Networks) – Part I - Nov 10, 2017.
A great introductory and high-level summary of Generative Adversarial Networks.
- TensorFlow: What Parameters to Optimize? - Nov 9, 2017.
Learning the TensorFlow Core API, the lowest-level API in TensorFlow, is a very good first step because it lets you understand the kernel of the library. Here is a very simple example of the TensorFlow Core API in which we create and train a linear regression model.
- Top KDnuggets tweets, Nov 01-07: Airbnb develops an #AI which converts design into source code - Nov 8, 2017.
Also: One LEGO at a time: Explaining the #Math of How #NeuralNetworks Learn; 6 Books Every #DataScientist Should Keep Nearby; Direct from Sebastian Raschka #Python #MachineLearning book, new edition.
- Real World Deep Learning: Neural Networks for Smart Crops - Nov 7, 2017.
The advances in image classification, object detection, and semantic segmentation using deep convolutional neural networks, together with the availability of open source tools such as Caffe and TensorFlow (to name a couple) for easily manipulating neural network graphs, made a very strong case in favor of CNNs for our classifier.
- Want to know how Deep Learning works? Here’s a quick guide for everyone - Nov 3, 2017.
Once you’ve read this article, you will understand the basics of AI and ML. More importantly, you will understand how Deep Learning, the most popular type of ML, works.
- KDnuggets™ News 17:n42, Nov 1: 7 Steps to Mastering Deep Learning with Keras; 6 Books Every Data Scientist Should Keep Nearby - Nov 1, 2017.
7 Steps to Mastering Deep Learning with Keras; 6 Books Every Data Scientist Should Keep Nearby; Neural Networks, Step 1: Where to Begin with Neural Nets & Deep Learning; XGBoost: A Concise Technical Overview; AlphaGo Zero: The Most Significant Research Advance in AI
- 7 Steps to Mastering Deep Learning with Keras - Oct 30, 2017.
Are you interested in learning how to use Keras? Do you already have an understanding of how neural networks work? Check out this lean, fat-free 7-step plan for going from Keras newbie to master of its basics as quickly as possible.
- Neural Networks, Step 1: Where to Begin with Neural Nets & Deep Learning - Oct 28, 2017.
This is a short post for beginners learning neural networks, covering several essential neural networks concepts.
- Hello, World: Building an AI that understands the world through video - Oct 26, 2017.
At TwentyBN, we have created the world’s first AI technology that shows an awareness of its environment and of the actions occurring within it. Our system observes the world through live video and automatically interprets the unfolding visual scene.
- Neural Network Foundations, Explained: Updating Weights with Gradient Descent & Backpropagation - Oct 25, 2017.
In neural networks, connection weights are adjusted in order to help reconcile the differences between the actual and predicted outcomes for subsequent forward passes. But how, exactly, do these weights get adjusted?
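Concretely, each weight moves a small step against the gradient of the loss with respect to that weight. For a single sigmoid neuron with squared error, a hedged NumPy sketch of repeated updates (toy inputs and learning rate are my own choices):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One neuron, one training example.
x, target = np.array([0.5, -1.0]), 1.0
w, lr = np.array([0.1, 0.2]), 0.5

for _ in range(500):
    pred = sigmoid(w @ x)                          # forward pass
    # Backward pass via the chain rule: dL/dw = (pred - target) * pred * (1 - pred) * x
    grad = (pred - target) * pred * (1 - pred) * x
    w = w - lr * grad                              # step against the gradient
```

Each pass shrinks the difference between `pred` and `target`; in a multi-layer network, backpropagation applies this same chain rule layer by layer to get each weight's gradient.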
- TensorFlow: Building Feed-Forward Neural Networks Step-by-Step - Oct 23, 2017.
This article will take you through all steps required to build a simple feed-forward neural network in TensorFlow by explaining each step in details.
- 5 Free Resources for Furthering Your Understanding of Deep Learning - Oct 20, 2017.
This post includes 5 specific video-based options for furthering your understanding of neural networks and deep learning, collectively consisting of many, many hours of insights.
- 7 Types of Artificial Neural Networks for Natural Language Processing - Oct 19, 2017.
What is an artificial neural network? How does it work? What types of artificial neural networks exist? How are different types of artificial neural networks used in natural language processing? We will discuss all these questions in the following article.