Every machine learning model tries to solve a problem with a different objective on a different dataset, so it is important to understand the context before choosing an evaluation metric.
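As a minimal illustration of why context matters, the sketch below (hypothetical data, scikit-learn assumed available) shows a classifier that looks excellent by accuracy yet useless by F1 on an imbalanced problem:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Hypothetical imbalanced problem: 95% negatives, 5% positives.
y_true = np.array([0] * 95 + [1] * 5)
y_pred = np.zeros(100, dtype=int)  # a "model" that always predicts the majority class

print(accuracy_score(y_true, y_pred))                 # 0.95 -- looks great
print(f1_score(y_true, y_pred, zero_division=0))      # 0.0  -- never finds a positive
```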
It wasn’t until I wrote my own simple blockchain that I truly understood what it is and what it can be used for. So, without further ado, let's set up our 7 functions!
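As a rough, hypothetical sketch of the core idea (not the article's exact seven functions), a chain of hash-linked blocks can be built with nothing but the standard library:

```python
import hashlib
import json
import time

def hash_block(block):
    """Deterministically hash a block's contents with SHA-256."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def create_genesis_block():
    """First block in the chain; it has no real predecessor."""
    return {"index": 0, "timestamp": time.time(), "data": "genesis", "prev_hash": "0" * 64}

def add_block(chain, data):
    """Append a new block that points at the hash of the current tip."""
    prev = chain[-1]
    block = {"index": prev["index"] + 1, "timestamp": time.time(),
             "data": data, "prev_hash": hash_block(prev)}
    chain.append(block)
    return block

def is_valid(chain):
    """Re-check every link: each block must reference its predecessor's hash."""
    return all(chain[i]["prev_hash"] == hash_block(chain[i - 1]) for i in range(1, len(chain)))

chain = [create_genesis_block()]
add_block(chain, {"from": "alice", "to": "bob", "amount": 5})
print(is_valid(chain))  # True; tampering with any earlier block breaks the links
```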
We bring to you the top 16 open source deep learning libraries and platforms. TensorFlow is out in front as the undisputed number one, with Keras and Caffe completing the top three.
Deep learning brings multiple benefits for learning multi-level representations of natural language. Here we cover the motivation for using deep learning and distributed representations in NLP, word embeddings and several methods for computing them, and their applications.
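For a quick taste of one such method, the sketch below trains skip-gram word vectors with gensim (the 4.x API and parameter names are assumed; the toy corpus and values are purely illustrative):

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
corpus = [
    ["deep", "learning", "learns", "representations"],
    ["word", "embeddings", "capture", "meaning"],
    ["deep", "learning", "helps", "nlp"],
]

# sg=1 selects skip-gram; vector_size is the embedding dimensionality.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

print(model.wv["deep"][:5])                   # first few components of one embedding
print(model.wv.most_similar("deep", topn=2))  # nearest neighbours in the toy vector space
```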
The tough thing about learning data science is remembering all the syntax. While at Dataquest we advocate getting used to consulting the Python documentation, sometimes it's nice to have a handy reference, so we've put together this cheat sheet to help you out!
What are the advantages of ConvNets over FC networks in image analysis? How is a ConvNet derived from an FC network? Where does the term convolution in CNNs come from? These questions are answered in this article.
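One concrete advantage, weight sharing, is easy to see by counting parameters. The sketch below (PyTorch assumed; layer sizes chosen for illustration) compares a single conv layer with a fully connected layer over the same 32x32 RGB input:

```python
import torch.nn as nn

def n_params(layer):
    return sum(p.numel() for p in layer.parameters())

# A conv layer shares one small 3x3 kernel per output channel across the whole image...
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3)
# ...while a fully connected layer needs a separate weight for every input pixel.
fc = nn.Linear(in_features=32 * 32 * 3, out_features=16)

print(n_params(conv))  # 16*3*3*3 + 16 = 448
print(n_params(fc))    # 3072*16 + 16 = 49,168
```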
First part of a full discussion of how to do distributed deep learning with Apache Spark. This part covers what Spark is, the basics of Spark + DL, and a little more.
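As a minimal taste of the "what is Spark" part, the snippet below (PySpark assumed installed, local mode, toy data) spins up a session and runs a distributed aggregation:

```python
from pyspark.sql import SparkSession

# Local Spark session; on a cluster the same code runs across many executors.
spark = SparkSession.builder.appName("spark-basics").master("local[*]").getOrCreate()

df = spark.createDataFrame(
    [("cat", 1.0), ("dog", 0.0), ("cat", 0.5)],
    ["label", "score"],
)

# Transformations are lazy; the aggregation only executes when .show() is called.
df.groupBy("label").avg("score").show()

spark.stop()
```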
In this article, we will focus on the modern trends that gained real traction in the market by the end of 2017 and discuss the major breakthroughs expected in 2018.
This is a summary of 12 key lessons that machine learning researchers and practitioners have learned, including pitfalls to avoid, important issues to focus on, and answers to common questions.
PyTorch has emerged as a major contender in the race to be the king of deep learning frameworks. What makes it really alluring is its dynamic computation graph paradigm.
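A tiny, hypothetical example of what "dynamic" means in practice: the graph is rebuilt on every forward pass, so ordinary Python control flow can change its shape and autograd still tracks it.

```python
import torch

x = torch.randn(4, requires_grad=True)

# The number of loop iterations is decided at run time, so each call
# can build a differently shaped graph -- and backward() still works.
y = x
steps = int(torch.randint(1, 4, (1,)))
for _ in range(steps):
    y = torch.relu(y) * 2

loss = y.sum()
loss.backward()
print(steps, x.grad)
```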
Just as we discussed for the CBOW model, we now need to frame the Skip-gram architecture as a deep learning classification model that takes the target word as input and tries to predict the context words.
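A minimal sketch of that framing (written here in PyTorch rather than the article's own code; vocabulary size and word indices are made up): embed the target word, score every vocabulary word as a candidate context word, and train with cross-entropy.

```python
import torch
import torch.nn as nn

class SkipGram(nn.Module):
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # target word -> dense vector
        self.out = nn.Linear(embed_dim, vocab_size)       # scores over all candidate context words

    def forward(self, target_ids):
        return self.out(self.embed(target_ids))

model = SkipGram(vocab_size=5000, embed_dim=100)
loss_fn = nn.CrossEntropyLoss()

targets = torch.tensor([12, 47])    # hypothetical target-word indices
contexts = torch.tensor([13, 46])   # hypothetical context-word indices

loss = loss_fn(model(targets), contexts)  # predict the context word from the target word
loss.backward()
```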
Although an ETL task is pretty challenging when it comes to loading Big Data, there is still a scenario in which you can load terabytes of data from Postgres into BigQuery relatively easily and very efficiently.
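One hedged sketch of the loading leg, using the google-cloud-bigquery client (bucket, dataset, and table names are placeholders, and the CSV export from Postgres is assumed to already sit in Cloud Storage):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder identifiers -- replace with your own project, dataset, table and bucket.
table_id = "my-project.my_dataset.events"
gcs_uri = "gs://my-bucket/postgres-export/events-*.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # let BigQuery infer the schema
)

# BigQuery pulls the files directly from Cloud Storage; no data flows through this machine.
load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()  # wait for completion
print(client.get_table(table_id).num_rows)
```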
No other means of data description is more comprehensive than descriptive statistics, and with ever-increasing data volumes and the need for low-latency decision making, its relevance will only continue to grow.
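The core toolkit fits in a few lines; here is a quick sketch with pandas (toy data, column name made up):

```python
import pandas as pd

# Hypothetical dataset of order values.
df = pd.DataFrame({"order_value": [12.5, 40.0, 7.3, 55.1, 23.9, 8.0]})

print(df["order_value"].describe())                          # count, mean, std, min, quartiles, max
print(df["order_value"].median(), df["order_value"].skew())  # extra measures on demand
```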
Charts have enormous power to aid accurate interpretation, which is why it is vital to select the correct chart type when you are trying to visualize data.
In this article, we demonstrate the utility of the native NumPy file format .npy over CSV for reading large numerical data sets. It can be a useful trick if the same CSV data file needs to be read many times.
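A rough sketch of the trick (array size and file names are illustrative): save the array once as .npy, then time both read paths.

```python
import time
import numpy as np

data = np.random.rand(100_000, 10)               # illustrative array size
np.savetxt("data.csv", data, delimiter=",")      # text format: must be parsed on every read
np.save("data.npy", data)                        # binary format: raw buffer plus a small header

t0 = time.perf_counter()
_ = np.loadtxt("data.csv", delimiter=",")
t1 = time.perf_counter()
_ = np.load("data.npy")
t2 = time.perf_counter()

print(f"CSV read: {t1 - t0:.3f}s, .npy read: {t2 - t1:.3f}s")
```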