- The Most Complete Guide to PyTorch for Data Scientists - Sep 24, 2020.
All the PyTorch functionality you will ever need while doing deep learning, from an experimentation/research perspective.
- KDnuggets™ News 20:n36, Sep 23: New Poll: Which Python IDE / Editor did you use the most in 2020?; Automating Every Aspect of Your Python Project - Sep 23, 2020.
New Poll: Which Python IDE / Editor did you use the most in 2020?; Automating Every Aspect of Your Python Project; Autograd: The Best Machine Learning Library You're Not Using?; Implementing a Deep Learning Library from Scratch in Python; Online Certificates/Courses in AI, Data Science, Machine Learning; Can Neural Networks Show Imagination?
- Implementing a Deep Learning Library from Scratch in Python - Sep 17, 2020.
A beginner’s guide to understanding the fundamental building blocks of deep learning platforms.
- Can Neural Networks Show Imagination? DeepMind Thinks They Can - Sep 16, 2020.
DeepMind has done some of the most relevant work in the area of simulating imagination in deep learning systems.
- Autograd: The Best Machine Learning Library You’re Not Using? - Sep 16, 2020.
If there is a Python library that is emblematic of the simplicity, flexibility, and utility of differentiable programming, it has to be Autograd.
- AI Papers to Read in 2020 - Sep 10, 2020.
Reading suggestions to keep you up-to-date with the latest and classic breakthroughs in AI and Data Science.
- How Do Neural Networks Learn? - Aug 17, 2020.
Neural networks are hugely popular today in AI and machine learning, yet they can still look like a black box in terms of how they learn to make predictions. To understand what is going on deep inside these networks, we must consider how they perform optimization.
- Batch Normalization in Deep Neural Networks - Aug 7, 2020.
Batch normalization is a technique for training very deep neural networks that normalizes the contributions to a layer for each mini-batch.
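The mechanics behind that one-line description fit in a few lines of NumPy. This is an illustrative from-scratch sketch of the training-time computation (running statistics for inference and the learning of gamma/beta are omitted), not code from the article:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch dimension...
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # ...then let learned parameters rescale and shift the result.
    return gamma * x_hat + beta

x = np.random.randn(32, 4)  # mini-batch of 32 examples, 4 features
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With gamma = 1 and beta = 0, each feature of `out` has roughly zero mean and unit variance across the batch.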
- Deep Learning for Signal Processing: What You Need to Know - Jul 27, 2020.
Signal Processing is a branch of electrical engineering that models and analyzes data representations of physical events. It is at the core of the digital world. And now, signal processing is starting to make some waves in deep learning.
- KDnuggets™ News 20:n28, Jul 22: Data Science MOOCs are too Superficial; The Bitter Lesson of Machine Learning - Jul 22, 2020.
Data Science MOOCs are too Superficial; The Bitter Lesson of Machine Learning; Building a REST API with Tensorflow Serving (Part 1); 3 Advanced Python Features You Should Know; Understanding How Neural Networks Think;
- Free MIT Courses on Calculus: The Key to Understanding Deep Learning - Jul 8, 2020.
Calculus is the key to fully understanding how neural networks function. Go beyond a surface understanding of this mathematics discipline with these free course materials from MIT.
- PyTorch for Deep Learning: The Free eBook - Jul 7, 2020.
For this week's free eBook, check out the newly released Deep Learning with PyTorch from Manning, made freely available via PyTorch's website for a limited time. Grab it now!
- Learning by Forgetting: Deep Neural Networks and the Jennifer Aniston Neuron - Jun 25, 2020.
DeepMind’s research shows how to understand the role of individual neurons in a neural network.
- The Most Important Fundamentals of PyTorch you Should Know - Jun 18, 2020.
PyTorch is a constantly developing deep learning framework with many exciting additions and features. We review its basic elements and show an example of building a simple Deep Neural Network (DNN) step-by-step.
- Introduction to Convolutional Neural Networks - Jun 3, 2020.
The article focuses on explaining the key components of a CNN and its implementation using the Keras Python library.
- 5 Machine Learning Papers on Face Recognition - May 28, 2020.
This article highlights some recent research and introduces five machine learning papers on face recognition.
- Are Tera Operations Per Second (TOPS) Just hype? Or Dark AI Silicon in Disguise? - May 27, 2020.
This article explains why TOPS isn’t as accurate a gauge as many people think, and discusses other criteria that should be considered when evaluating a solution to a real application.
- Satellite Image Analysis with fast.ai for Disaster Recovery - May 14, 2020.
We were asked to build ML models using the novel xBD dataset provided by the organizers to estimate damage to infrastructure with the goal of reducing the amount of human labour and time required to plan an appropriate response. This article will focus on the technical aspects of our solution and share our experiences.
- DeepMind’s Suggestions for Learning #AtHomeWithAI - May 13, 2020.
DeepMind has been sharing resources for learning AI at home on their Twitter account. Check out a few of these suggestions here, and keep your eye on the #AtHomeWithAI hashtag for more.
- 5 Concepts You Should Know About Gradient Descent and Cost Function - May 7, 2020.
Why is Gradient Descent so important in Machine Learning? Learn more about this iterative optimization algorithm and how it is used to minimize a loss function.
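The iteration behind gradient descent fits in a few lines; here is a minimal sketch on a hypothetical one-dimensional loss (a toy example, not taken from the article):

```python
# Minimize the toy loss f(w) = (w - 3)**2 by repeatedly stepping
# against its gradient f'(w) = 2 * (w - 3).
def grad(w):
    return 2 * (w - 3)

w = 0.0   # initial guess
lr = 0.1  # learning rate
for _ in range(100):
    w -= lr * grad(w)  # the gradient descent update rule
```

After 100 steps, w has converged to the minimizer w = 3 within floating-point tolerance.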
- Deep Learning: The Free eBook - May 4, 2020.
"Deep Learning" is the quintessential book for understanding deep learning theory, and you can still read it freely online.
- Introducing Brain Simulator II: A New Platform for AGI Experimentation - Apr 29, 2020.
A growing consensus of researchers contends that new algorithms are needed to transform narrow AI into AGI. Brain Simulator II is free software for developing new algorithms targeted at AGI; you can experiment with it and participate in its development.
- LSTM for time series prediction - Apr 27, 2020.
Learn how to develop an LSTM neural network with PyTorch on trading data to predict future prices by mimicking actual values of the time series data.
- OpenAI Open Sources Microscope and the Lucid Library to Visualize Neurons in Deep Neural Networks - Apr 17, 2020.
The new tools show the potential of data visualizations for understanding features in a neural network.
- Build PyTorch Models Easily Using torchlayers - Apr 9, 2020.
torchlayers aims to do what Keras did for TensorFlow, providing a higher-level model-building API and some handy defaults and add-ons useful for crafting PyTorch neural networks.
- 10 Must-read Machine Learning Articles (March 2020) - Apr 9, 2020.
This list will feature some of the recent work and discoveries happening in machine learning, as well as guides and resources for both beginner and intermediate data scientists.
- 3 Reasons to Use Random Forest® Over a Neural Network: Comparing Machine Learning versus Deep Learning - Apr 8, 2020.
The random forest algorithm and neural networks are different techniques that learn in different ways but can be used in similar domains. Why would you use one over the other?
- Graph Neural Network model calibration for trusted predictions - Mar 24, 2020.
In this article, we’ll talk about calibration in graph machine learning, and how it can help to build trust in these powerful new models.
- Build an Artificial Neural Network From Scratch: Part 2 - Mar 20, 2020.
The second article in this series focuses on building an Artificial Neural Network using the Numpy Python library.
- Generate Realistic Human Face using GAN - Mar 10, 2020.
This article contains a brief introduction to Generative Adversarial Networks (GANs) and shows how to build a human face generator.
- TensorFlow 2.0 Tutorial: Optimizing Training Time Performance - Mar 5, 2020.
Tricks to improve TensorFlow training time with tf.data pipeline optimizations, mixed precision training and multi-GPU strategies.
- Recreating Fingerprints using Convolutional Autoencoders - Mar 4, 2020.
The article gets you started working with fingerprints using Deep Learning.
- KDnuggets™ News 20:n07, Feb 19: 20 AI, Data Science, Machine Learning Terms for 2020; Why Did I Reject a Data Scientist Job? - Feb 19, 2020.
This week on KDnuggets: 20 AI, Data Science, Machine Learning Terms You Need to Know in 2020; Why Did I Reject a Data Scientist Job?; Fourier Transformation for a Data Scientist; Math for Programmers; Deep Neural Networks; Practical Hyperparameter Optimization; and much more!
- Deep Neural Networks - Feb 14, 2020.
We examine the features and applications of a deep neural network.
- A bird’s-eye view of modern AI from NeurIPS 2019 - Jan 28, 2020.
With the explosion of the field of AI/ML impacting so many applications and industries, there is great value coming out of recent progress. This review highlights many research areas covered at the NeurIPS 2019 conference recently held in Vancouver, Canada, and features many important areas of progress we expect to see in the coming year.
- Microsoft Introduces Project Petridish to Find the Best Neural Network for Your Problem - Jan 20, 2020.
The new algorithm takes a novel approach to neural architecture search.
- Uber Creates Generative Teaching Networks to Better Train Deep Neural Networks - Jan 13, 2020.
The new technique can really improve how deep learning models are trained at scale.
- Top KDnuggets tweets, Dec 18-30: A Gentle Introduction to Math Behind Neural Networks - Dec 31, 2019.
A Gentle Introduction to #Math Behind #NeuralNetworks; Learn How to Quickly Create UIs in Python; I wanna be a data scientist, but... how!?; I created my own deepfake in two weeks
- Fighting Overfitting in Deep Learning - Dec 27, 2019.
This post outlines an attack plan for fighting overfitting in neural networks.
- Random Forest® vs Neural Networks for Predicting Customer Churn - Dec 26, 2019.
Let us see how random forest competes with neural networks for solving a real world business problem.
- 5 Techniques to Prevent Overfitting in Neural Networks - Dec 6, 2019.
In this article, I will present five techniques to prevent overfitting while training neural networks.
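One technique that commonly appears on such lists is dropout; here is a minimal NumPy sketch of the inverted-dropout idea (an illustrative example, not the article's code):

```python
import numpy as np

def dropout(a, rate, rng, train=True):
    # Inverted dropout: randomly zero activations during training,
    # scaling the survivors so the expected activation is unchanged.
    if not train or rate == 0.0:
        return a
    mask = rng.random(a.shape) >= rate  # keep each unit with prob. 1 - rate
    return a * mask / (1.0 - rate)

rng = np.random.default_rng(0)
acts = np.ones(10_000)                      # toy layer activations
dropped = dropout(acts, rate=0.5, rng=rng)  # train mode: about half are zeroed
```

At test time (`train=False`) the layer is an identity, so no rescaling is needed at inference.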
- Enabling the Deep Learning Revolution - Dec 5, 2019.
Deep learning models are revolutionizing the business and technology world with jaw-dropping performances in one application area after another. Read this post on some of the numerous composite technologies which give deep learning its complex nonlinearity.
- KDnuggets™ News 19:n45, Nov 27: Interpretable vs black box models; Advice for New and Junior Data Scientists - Nov 27, 2019.
This week: Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead; Advice for New and Junior Data Scientists; Python Tuples and Tuple Methods; Can Neural Networks Develop Attention? Google Thinks they Can; Three Methods of Data Pre-Processing for Text Classification
- Can Neural Networks Develop Attention? Google Thinks they Can - Nov 25, 2019.
Google recently published some work about modeling attention mechanisms in deep neural networks.
- Neural Networks 201: All About Autoencoders - Nov 21, 2019.
Autoencoders can be a very powerful tool for leveraging unlabeled data to solve a variety of problems, such as learning a "feature extractor" that helps build powerful classifiers, finding anomalies, or doing a Missing Value Imputation.
- Deep Learning for Image Classification with Less Data - Nov 20, 2019.
In this blog I will be demonstrating how deep learning can be applied even if we don’t have enough data.
- Generalization in Neural Networks - Nov 18, 2019.
When training a neural network in deep learning, its performance on processing new data is key. Improving the model's ability to generalize relies on preventing overfitting using these important methods.
- Research Guide for Depth Estimation with Deep Learning - Nov 12, 2019.
In this guide, we’ll look at papers aimed at solving the problems of depth estimation using deep learning.
- Designing Your Neural Networks - Nov 4, 2019.
Check out this step-by-step walk through of some of the more confusing aspects of neural nets to guide you to making smart decisions about your neural network architecture.
- Build an Artificial Neural Network From Scratch: Part 1 - Nov 1, 2019.
This article focuses on building an Artificial Neural Network using the Numpy Python library.
- KDnuggets™ News 19:n40, Oct 23: How to Become a (Good) Data Scientist; Writing Your First Neural Net in 30 Lines with Keras - Oct 23, 2019.
Read useful advice on how to become a good data scientist; see how you can write your 1st neural net in under 30 lines of Keras code; Understand why AI salaries are heading skywards and what skills you need for them; and read about key ideas and methods in anomaly detection
- This Microsoft Neural Network can Answer Questions About Scenic Images with Minimum Training - Oct 21, 2019.
Recently, a group of AI experts from Microsoft Research published a paper proposing a method for scene understanding that combines two key tasks: image captioning and visual question answering (VQA).
- Writing Your First Neural Net in Less Than 30 Lines of Code with Keras - Oct 18, 2019.
Read this quick overview of neural networks and learn how to implement your first in very few lines using Keras.
- Research Guide for Video Frame Interpolation with Deep Learning - Oct 15, 2019.
In this research guide, we’ll look at deep learning papers aimed at synthesizing video frames within an existing video.
- Using Neural Networks to Design Neural Networks: The Definitive Guide to Understand Neural Architecture Search - Oct 14, 2019.
A recent survey outlined the main neural architecture search methods used to automate the design of deep learning systems.
- Activation maps for deep learning models in a few lines of code - Oct 10, 2019.
We illustrate how to show the activation maps of various layers in a deep CNN model with just a couple of lines of code.
- Introduction to Artificial Neural Networks - Oct 8, 2019.
In this article, we’ll try to cover everything related to Artificial Neural Networks or ANN.
- Research Guide for Neural Architecture Search - Oct 4, 2019.
In this guide, we will explore a range of research papers that have sought to solve the challenging task of automating neural network design.
- Recreating Imagination: DeepMind Builds Neural Networks that Spontaneously Replay Past Experiences - Oct 3, 2019.
DeepMind researchers created a model able to replay past experiences in a way that simulates the mechanisms of the hippocampus.
- A Gentle Introduction to PyTorch 1.2 - Sep 20, 2019.
This comprehensive tutorial aims to introduce the fundamentals of PyTorch building blocks for training neural networks.
- KDnuggets™ News 19:n32, Aug 28: Handy SQL Features for Data Scientists; Nothing but NumPy: Creating Neural Networks with Computational Graphs - Aug 28, 2019.
Most useful SQL features for Data Scientists; an excellent tutorial on creating neural nets from scratch with Numpy; TensorFlow 2.0 highlights, explained; How to sell your boss on Data Analytics; and more.
- Nothing but NumPy: Understanding & Creating Neural Networks with Computational Graphs from Scratch - Aug 23, 2019.
Entirely implemented with NumPy, this extensive tutorial provides a detailed review of neural networks followed by guided code for creating one from scratch with computational graphs.
- PyTorch Lightning vs PyTorch Ignite vs Fast.ai - Aug 16, 2019.
Here, I will attempt an objective comparison between all three frameworks. This comparison comes from laying out similarities and differences objectively found in tutorials and documentation of all three frameworks.
- Keras Callbacks Explained In Three Minutes - Aug 9, 2019.
A gentle introduction to callbacks in Keras. Learn about EarlyStopping, ModelCheckpoint, and other callback functions with code examples.
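The logic behind a callback like EarlyStopping is simple enough to re-implement; this standalone sketch mirrors the idea (patience, best-so-far tracking) without depending on Keras itself:

```python
class EarlyStopping:
    """Stop training when the monitored loss hasn't improved for `patience` epochs."""
    def __init__(self, patience=3):
        self.patience = patience
        self.best = float("inf")
        self.wait = 0

    def should_stop(self, loss):
        if loss < self.best:   # improvement: record it and reset the counter
            self.best, self.wait = loss, 0
        else:                  # no improvement: count the epoch
            self.wait += 1
        return self.wait >= self.patience

stopper = EarlyStopping(patience=2)
losses = [0.9, 0.7, 0.71, 0.72, 0.5]  # hypothetical validation losses
for epoch, loss in enumerate(losses):
    if stopper.should_stop(loss):
        break  # stops at epoch 3, never seeing the 0.5 at epoch 4
```

In Keras, the equivalent behavior comes from adding `tf.keras.callbacks.EarlyStopping(patience=2)` to the `callbacks` list passed to `model.fit`.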
- 9 Tips For Training Lightning-Fast Neural Networks In Pytorch - Aug 9, 2019.
Who is this guide for? Anyone working on non-trivial deep learning models in PyTorch, such as industrial researchers, Ph.D. students, and academics. The models we're talking about here might be taking you multiple days to train, or even weeks or months.
- Deep Learning for NLP: ANNs, RNNs and LSTMs explained! - Aug 7, 2019.
Learn about Artificial Neural Networks, Deep Learning, Recurrent Neural Networks and LSTMs like never before and use NLP to build a Chatbot!
- Top KDnuggets tweets, Jul 24-30: Nothing but NumPy: Understanding and Creating Neural Nets w. Computational Graphs from Scratch; How Netflix works - Jul 31, 2019.
How Netflix works: the (hugely simplified) complex stuff that happens every time; Top Certificates and Certifications in Analytics, Data Science, ML; Nothing but NumPy: Understanding & Creating Neural Networks with Computational Graphs from Scratch.
- Convolutional Neural Networks: A Python Tutorial Using TensorFlow and Keras - Jul 26, 2019.
Different neural network architectures excel in different tasks. This particular article focuses on crafting convolutional neural networks in Python using TensorFlow and Keras.
- A Gentle Introduction to Noise Contrastive Estimation - Jul 25, 2019.
Find out how to use randomness to learn your data by using Noise Contrastive Estimation with this guide that works through the particulars of its implementation.
- Neural Code Search: How Facebook Uses Neural Networks to Help Developers Search for Code Snippets - Jul 24, 2019.
Developers are always searching for answers to questions about their code. But how do they ask the right questions? Facebook is creating new NLP neural networks to help search code repositories that may advance information retrieval algorithms.
- This New Google Technique Helps Us Understand How Neural Networks are Thinking - Jul 24, 2019.
Recently, researchers from the Google Brain team published a paper proposing a new method called Concept Activation Vectors (CAVs) that takes a new angle to the interpretability of deep learning models.
- Training a Neural Network to Write Like Lovecraft - Jul 11, 2019.
In this post, the author attempts to train a neural network to generate Lovecraft-esque prose, known to be awkward and irregular at best. Did it end in success? If not, any suggestions on how it might have? Read on to find out.
- Evolving Deep Neural Networks - Jun 18, 2019.
This article reviews how evolutionary algorithms have been proposed and tested as a competitive alternative to address a number of issues related to neural network design.
- How to Automate Hyperparameter Optimization - Jun 12, 2019.
A step-by-step guide to performing a hyperparameter optimization task on a deep learning model by employing Bayesian Optimization that uses the Gaussian Process. We used the gp_minimize function provided by the Scikit-Optimize (skopt) library to perform this task.
- KDnuggets™ News 19:n22, Jun 12: The Modern Open-Source Data Science/Machine Learning Ecosystem; Simplifying the Data Visualisation Process in Python - Jun 12, 2019.
The 6 tools in the modern open-source Data Science ecosystem; Simplifying the Data Visualisation Process in Python; The Infinity Stones of Data Science; Best resources for developers transitioning into data science.
- Random Forests® vs Neural Networks: Which is Better, and When? - Jun 7, 2019.
Random Forests and Neural Networks are two widely used machine learning algorithms. What is the difference between the two approaches? When should one use a Neural Network or a Random Forest?
- Understanding Backpropagation as Applied to LSTM - May 30, 2019.
Backpropagation is one of those topics that seem to confuse many once you move past feed-forward neural networks and progress to convolutional and recurrent neural networks. This article gives you an overall process for understanding backpropagation by giving you its underlying principles.
- How the Lottery Ticket Hypothesis is Challenging Everything we Knew About Training Neural Networks - May 30, 2019.
The training of machine learning models is often compared to winning the lottery by buying every possible ticket. But if we know what winning the lottery looks like, couldn't we be smarter about selecting the tickets?
- Building a Computer Vision Model: Approaches and datasets - May 20, 2019.
How can we build a computer vision model using CNNs? What are existing datasets? And what are approaches to train the model? This article provides an answer to these essential questions when trying to understand the most important concepts of computer vision.
- Large-Scale Evolution of Image Classifiers - May 16, 2019.
Deep neural networks excel in many difficult tasks, given large amounts of training data and enough processing power. The neural network architecture is an important factor in achieving a highly accurate model... Techniques to automatically discover these neural network architectures are, therefore, highly desirable.
- Graduating in GANs: Going From Understanding Generative Adversarial Networks to Running Your Own - Apr 25, 2019.
Read how generative adversarial networks (GANs) research and evaluation has developed then implement your own GAN to generate handwritten digits.
- Attention Craving RNNS: Building Up To Transformer Networks - Apr 24, 2019.
RNNs let us model sequences in neural networks. While there are other ways of modeling sequences, RNNs are particularly useful. RNNs come in two main flavors: LSTMs (Hochreiter et al, 1997) and GRUs (Cho et al, 2014).
- KDnuggets™ News 19:n14, Apr 10: Which Data Science/ML methods and algorithms you used? Predict Age and Gender Using Neural Nets - Apr 10, 2019.
Getting started with NLP using the PyTorch framework; Building a Recommender System; Advice for New Data Scientists; All you need to know about text preprocessing for NLP and Machine Learning; Advanced Keras - Constructing Complex Custom Losses and Metrics; Top 8 Data Science Use Cases in Gaming
- Advanced Keras — Constructing Complex Custom Losses and Metrics - Apr 8, 2019.
In this tutorial I cover a simple trick that will allow you to construct custom loss functions in Keras which can receive arguments other than y_true and y_pred.
- Training a Champion: Building Deep Neural Nets for Big Data Analytics - Apr 4, 2019.
Introducing Sisense Hunch, the new way of handling Big Data sets that uses AQP technology to construct Deep Neural Networks (DNNs) which are trained to learn the relationships between queries and their results in these huge datasets.
- Getting started with NLP using the PyTorch framework - Apr 3, 2019.
We discuss the classes that PyTorch provides for helping with Natural Language Processing (NLP) and how they can be used for related tasks using recurrent layers.
- Which Face is Real? - Apr 2, 2019.
Which Face Is Real? was developed based on Generative Adversarial Networks as a web application in which users select which image they believe shows a real person and which was synthetically generated. The person in the synthetically generated photo does not exist.
- Feature Reduction using Genetic Algorithm with Python - Mar 25, 2019.
This tutorial discusses how to use the genetic algorithm (GA) for reducing the feature vector extracted from the Fruits360 dataset in Python mainly using NumPy and Sklearn.
- Checklist for Debugging Neural Networks - Mar 22, 2019.
Check out these tangible steps you can take to identify and fix issues with training, generalization, and optimization for machine learning models.
- Artificial Neural Networks Optimization using Genetic Algorithm with Python - Mar 18, 2019.
This tutorial explains the usage of the genetic algorithm for optimizing the network weights of an Artificial Neural Network for improved performance.
- Advanced Keras — Accurately Resuming a Training Process - Mar 14, 2019.
This article on practical advanced Keras use covers handling nontrivial cases where custom callbacks are used.
- AI: Arms Race 2.0 - Mar 12, 2019.
An analysis of the current state of the competition between US, Europe, and China in AI, examining research, patent publications, global datasphere, devices and IoT, people, and more.
- Breaking neural networks with adversarial attacks - Mar 7, 2019.
We develop an intuition behind "adversarial attacks" on deep neural networks, and understand why these attacks are so successful.
- Neural Networks with Numpy for Absolute Beginners — Part 2: Linear Regression - Mar 7, 2019.
In this tutorial, you will learn to implement Linear Regression for prediction using Numpy in detail and also visualize how the algorithm learns epoch by epoch. In addition, you will explore two-layer Neural Networks.
- Neural Networks seem to follow a puzzlingly simple strategy to classify images - Mar 5, 2019.
We explain why state-of-the-art Deep Neural Networks can still recognize scrambled images perfectly well and how this helps to uncover a puzzlingly simple strategy that DNNs seem to use to classify natural images.
- Neural Networks with Numpy for Absolute Beginners: Introduction - Mar 5, 2019.
In this tutorial, you will get a brief understanding of what Neural Networks are and how they have been developed. In the end, you will gain a brief intuition as to how the network learns.
- Comparing MobileNet Models in TensorFlow - Mar 1, 2019.
MobileNets are a family of mobile-first computer vision models for TensorFlow, designed to effectively maximize accuracy while being mindful of the restricted resources for an on-device or embedded application.
- TensorFlow.js: Machine learning for the web and beyond - Feb 28, 2019.
- How to do Everything in Computer Vision - Feb 27, 2019.
The many standard tasks in computer vision all require special consideration: classification, detection, segmentation, pose estimation, enhancement and restoration, and action recognition. Let me show you how to do everything in Computer Vision with Deep Learning!
- Artificial Neural Network Implementation using NumPy and Image Classification - Feb 21, 2019.
This tutorial builds an artificial neural network in Python using NumPy from scratch in order to build an image classification application for the Fruits360 dataset.
- Deep Multi-Task Learning – 3 Lessons Learned - Feb 15, 2019.
We share specific points to consider when implementing multi-task learning in a Neural Network (NN) and present TensorFlow solutions to these issues.
- A comprehensive survey on graph neural networks - Feb 15, 2019.
This article summarizes a paper which presents us with a broad sweep of the graph neural network landscape. It’s a survey paper, so you’ll find details on the key approaches and representative papers, as well as information on commonly used datasets and benchmark performance on them.
- Neural Networks – an Intuition - Feb 7, 2019.
Neural networks are one of the most powerful algorithms used in the field of machine learning and artificial intelligence. We attempt to outline its similarities with the human brain and how intuition plays a big part in this.
- NLP Overview: Modern Deep Learning Techniques Applied to Natural Language Processing - Jan 8, 2019.
Trying to keep up with advancements at the overlap of neural networks and natural language processing can be troublesome. That's where today's spotlighted resource comes in.
- The Backpropagation Algorithm Demystified - Jan 2, 2019.
A crucial aspect of training a neural network is its ability to measure prediction error and propagate that error backward through the network to update its weights, becoming more accurate as more data is fed through it. Commonly referred to as backpropagation, this process isn't as complex as you might think.
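At its core, backpropagation is the chain rule applied backward from the loss; a one-weight toy example (hypothetical numbers, not from the article) makes that concrete, with a numerical check to confirm the analytic gradient:

```python
# One weight, one example: y_hat = w * x, loss = (y_hat - t)**2.
x, t, w = 2.0, 1.0, 1.5

y_hat = w * x                  # forward pass
loss = (y_hat - t) ** 2

dloss_dyhat = 2 * (y_hat - t)  # derivative of the outer (loss) function
dloss_dw = dloss_dyhat * x     # chain rule: multiply by d(y_hat)/dw

# Central-difference gradient check confirms the backpropagated value.
h = 1e-6
num_grad = (((w + h) * x - t) ** 2 - ((w - h) * x - t) ** 2) / (2 * h)
```

Real networks repeat exactly this multiply-and-accumulate of local derivatives, layer by layer, from the loss back to every weight.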
- Supervised Learning: Model Popularity from Past to Present - Dec 28, 2018.
An extensive look at the history of machine learning models, using historical data from the number of publications of each type to attempt to answer the question: what is the most popular model?
- BERT: State of the Art NLP Model, Explained - Dec 26, 2018.
BERT’s key technical innovation is applying the bidirectional training of Transformer, a popular attention model, to language modelling. It has caused a stir in the Machine Learning community by presenting state-of-the-art results in a wide variety of NLP tasks.
- The brain as a neural network: this is why we can’t get along - Dec 19, 2018.
This article sets out to answer the question: what insights can we gain about ourselves by thinking of the brain as a machine learning model?
- Deep Learning Cheat Sheets - Nov 28, 2018.
Check out this collection of high-quality deep learning cheat sheets, filled with valuable, concise information on a variety of neural network-related topics.
- Using a Keras Long Short-Term Memory (LSTM) Model to Predict Stock Prices - Nov 21, 2018.
LSTMs are very powerful in sequence prediction problems because they’re able to store past information. This is important in our case because the previous price of a stock is crucial in predicting its future price.
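That ability to store past information comes from the LSTM cell's gating; here is an illustrative single-step NumPy version of the cell (not the article's Keras model, and with random, untrained weights):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates decide what past information to keep and expose."""
    z = W @ x + U @ h + b        # stacked pre-activations for the four gates
    n = h.shape[0]
    f = sigmoid(z[:n])           # forget gate: how much old cell state to keep
    i = sigmoid(z[n:2 * n])      # input gate: how much new information to write
    g = np.tanh(z[2 * n:3 * n])  # candidate cell update
    o = sigmoid(z[3 * n:])       # output gate: how much cell state to expose
    c = f * c + i * g            # cell state carries the long-term memory
    h = o * np.tanh(c)           # hidden state is the step's output
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 1, 8
W = rng.normal(size=(4 * n_hid, n_in))
U = rng.normal(size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = c = np.zeros(n_hid)
for r in [0.010, 0.025, -0.018]:  # toy sequence of daily returns (scaled inputs)
    h, c = lstm_step(np.array([r]), h, c, W, U, b)
```

A trained model would feed the final h into a regression layer to predict the next value; Keras hides all of this behind its LSTM layer.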
- Introduction to PyTorch for Deep Learning - Nov 7, 2018.
In this tutorial, you’ll get an introduction to deep learning using the PyTorch framework, and by its conclusion, you’ll be comfortable applying it to your deep learning models.
- Mastering the Learning Rate to Speed Up Deep Learning - Nov 6, 2018.
Figuring out the optimal set of hyperparameters can be one of the most time consuming portions of creating a machine learning model, and that’s particularly true in deep learning.
- Introduction to Deep Learning with Keras - Oct 29, 2018.
In this article, we’ll build a simple neural network using Keras. Now let’s proceed to solve a real business problem: an insurance company wants you to develop a model to help them predict which claims look fraudulent.
- Generative Adversarial Networks – Paper Reading Road Map - Oct 24, 2018.
To help the others who want to learn more about the technical sides of GANs, I wanted to share some papers I have read in the order that I read them.
- The Main Approaches to Natural Language Processing Tasks - Oct 17, 2018.
Let's have a look at the main approaches to NLP tasks that we have at our disposal. We will then have a look at the concrete NLP tasks we can tackle with said approaches.
- [Webinar] Neural Network Fundamentals - Oct 16, 2018.
In this webinar, Oct 25, 2018, 10:00 am PST, we will apply a convolutional neural network to the ImageNet scenario. We will also review some of the ImageNet architectures and how convolutions work.
- Sequence Modeling with Neural Networks – Part I - Oct 3, 2018.
In this post, we will focus on sequences as a well-known data structure and study the learning framework specific to them.
- How to Create a Simple Neural Network in Python - Oct 2, 2018.
The best way to understand how neural networks work is to create one yourself. This article will demonstrate how to do just that.
- More Effective Transfer Learning for NLP - Oct 1, 2018.
Until recently, the natural language processing community was lacking its ImageNet equivalent — a standardized dataset and training objective to use for training base models.
- Introduction to Deep Learning - Sep 28, 2018.
I decided to begin to put some structure in my understanding of Neural Networks through this series of articles.
- Power Laws in Deep Learning 2: Universality - Sep 26, 2018.
It is amazing that Deep Neural Networks display this Universality in their weight matrices, and this suggests some deeper reason for Why Deep Learning Works.
- 6 Steps To Write Any Machine Learning Algorithm From Scratch: Perceptron Case Study - Sep 20, 2018.
Writing a machine learning algorithm from scratch is an extremely rewarding learning experience. We highlight 6 steps in this process.
- Power Laws in Deep Learning - Sep 20, 2018.
In pretrained, production quality DNNs, the weight matrices for the Fully Connected (FC) layers display Fat Tailed Power Law behavior.
- Data Augmentation For Bounding Boxes: Rethinking image transforms for object detection - Sep 19, 2018.
Data Augmentation is one way to battle a shortage of data, by artificially augmenting our dataset. In fact, the technique has proven so successful that it's become a staple of deep learning systems.
- Everything You Need to Know About AutoML and Neural Architecture Search - Sep 13, 2018.
So how does it work? How do you use it? What options do you have to harness that power today? Here’s everything you need to know about AutoML and NAS.
- Machine Learning Cheat Sheets - Sep 11, 2018.
Check out this collection of machine learning concept cheat sheets based on Stanford CS 229 material, including supervised and unsupervised learning, neural networks, tips & tricks, probability & stats, and algebra & calculus.
- Training with Keras-MXNet on Amazon SageMaker - Sep 10, 2018.
In this post, you will learn how to train Keras-MXNet jobs on Amazon SageMaker. I’ll show you how to build custom Docker containers for CPU and GPU training, configure multi-GPU training, pass parameters to a Keras script, and save the trained models in Keras and MXNet formats.
- Neural Networks and Deep Learning: A Textbook - Sep 7, 2018.
This book covers both classical and modern models in deep learning. The book is intended to be a textbook for universities, and it covers the theoretical and algorithmic aspects of deep learning.
- AI Knowledge Map: How To Classify AI Technologies - Aug 31, 2018.
What follows is an effort to draw an architecture for accessing knowledge on AI and following emergent dynamics: a gateway to pre-existing knowledge on the topic that will allow you to scout around for additional information and eventually create new knowledge on AI.
- Auto-Keras, or How You can Create a Deep Learning Model in 4 Lines of Code - Aug 17, 2018.
Auto-Keras is an open source software library for automated machine learning. Auto-Keras provides functions to automatically search for architecture and hyperparameters of deep learning models.