5 Deep Learning Projects You Can No Longer Overlook

There are a number of "mainstream" deep learning projects out there, but many more niche projects fly under the radar. Have a look at 5 such projects worth checking out.



Deep learning libraries and frameworks such as Theano, Keras, Caffe, and TensorFlow have gained enormous recent popularity. In fact, Google's TensorFlow is the most starred machine learning repository on GitHub. By a lot. TensorFlow, despite being in the wild for little more than 6 months, has captured such a formidable market share that one could argue it has become the default deep learning library for a large swath of seasoned neural network veterans and newcomers alike.

It's not the only library to consider, obviously. There are many others, a few of which are mentioned above. But there are many smaller projects as well, ranging from complete libraries implemented from scratch to high-level building blocks that sit atop established deep learning projects to fit particular niches. Below you can find a mix of these project types, noted for a variety of reasons as encountered around the web.


Maybe you'll find something that fills a need for you in this list of 5 deep learning projects you should no longer overlook. Items are in no particular order, but I like to number things, and so number things I shall.

1. Leaf

Leaf is a neural network framework, described in its Github repo README as:

Open Machine Intelligence Framework for Hackers. (GPU/CPU)

Somewhat interestingly, Leaf, which is quite a new project but has already gathered over 4,000 repo stars, is written in Rust. Rust itself is only about 6 years old, with development sponsored by Mozilla. For those unfamiliar with Rust, it is a systems language with similarities to C and C++, self-described as:

Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.

(Image: Leaf speed benchmarks compared with other deep learning frameworks)

A book, Leaf Machine Learning for Hackers, is freely available online, and is likely a good first stop for those looking to give Leaf a try. I would guess that Leaf won't gain a lot of converts from outside of the Rust ecosystem, even given the claims and quantitative support that Leaf is faster than most other similar frameworks out there (see the above image). However, the number of Rust users continues to grow, and no doubt some of them will be interested in building neural nets. It's good to know they have a quality native framework to employ in this pursuit.

2. tiny-cnn

From tiny-cnn's Github repo:

tiny-cnn is a C++11 implementation of deep learning. It is suitable for deep learning on limited computational resource, embedded systems and IoT devices.

tiny-cnn is relatively quick without a GPU; it boasts 98.8% accuracy on MNIST in 13 minutes of CPU training. It's also simple to use: since it's header-only, you simply include the tiny_cnn.h header and write your C++ code, with nothing else to install. tiny-cnn supports a whole host of network architectures, activation functions, and optimization algorithms.

Here's a quick example of constructing a Multi-layer Perceptron:

#include "tiny_cnn/tiny_cnn.h"
using namespace tiny_cnn;
using namespace tiny_cnn::activation;

void construct_mlp() {
    auto mynet = make_mlp<tan_h>({ 32 * 32, 300, 10 });

    assert(mynet.in_data_size() == 32 * 32);
    assert(mynet.out_data_size() == 10);
}


Check out the documentation, as well as this project using tiny-cnn to implement a convolutional neural network on Android. If you are set on implementing neural networks in C++, this is worth checking out.

3. Layered

Layered is authored by independent machine learning researcher Danijar Hafner, who recently contributed to KDnuggets the article "Introduction to Recurrent Networks in TensorFlow."

Layered is:

Clean implementation of feed forward neural networks.

Hafner wrote Layered in Python 3 as a "clean and modular implementation of feed forward neural networks." He states that he undertook the project as a means of better understanding deep learning concepts himself, and recommends doing so if you are interested in gaining a real appreciation of how deep neural networks actually function.

Here is an example of a simple neural network implementation in Layered:

from layered.network import Network, Layer
from layered.activation import Identity, Relu, Softmax

num_inputs = 784   # e.g. flattened 28x28 MNIST images
num_outputs = 10   # one output per digit class

network = Network([
    Layer(num_inputs, Identity),   # input layer
    Layer(700, Relu),              # hidden layers with rectifier activations
    Layer(500, Relu),
    Layer(300, Relu),
    Layer(num_outputs, Softmax),   # softmax output layer
])


The project currently supports identity, rectifier, sigmoid, and softmax activation functions, and squared error and cross-entropy cost functions. As such, if you are looking to see a no-nonsense, from-scratch implementation of neural network functionality, Layered would be a good place to start. The fact that the project also works and is actively developed gives you reason to do more than use it as a learning tool.
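
To get a feel for what such a from-scratch implementation boils down to, here is a minimal forward pass for a network shaped like the one above, written in plain NumPy. This is purely illustrative and is not Layered's API; all names here are my own:

import numpy as np

def relu(x):
    return np.maximum(0, x)

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

np.random.seed(0)

# Toy weights for a 784 -> 300 -> 10 network
W1, b1 = 0.01 * np.random.randn(784, 300), np.zeros(300)
W2, b2 = 0.01 * np.random.randn(300, 10), np.zeros(10)

x = np.random.rand(784)                # one fake input "image"
hidden = relu(x.dot(W1) + b1)          # hidden layer with rectifier units
output = softmax(hidden.dot(W2) + b2)  # 10 class probabilities summing to 1

Training is then a matter of propagating the cost's gradient back through these same operations, which is exactly the machinery Layered implements in a modular way.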

Hafner also has a number of tutorials on practical deep learning with TensorFlow, which I encourage you to have a look at.

4. Brain

I recently shared some JavaScript machine learning libraries of relevance, a list which included 3 neural network libraries (out of 5 total). Brain could have made that list, but it has been added here in order to provide some diversity.

Brain is a neural network library written in JavaScript, for use in the browser or with Node. The project is actively developed, having grown out of a previous, less mature project of the same name.

Here is an example of approximating the exclusive or (XOR) function with Brain:

var net = new brain.NeuralNetwork();

net.train([{input: [0, 0], output: [0]},
           {input: [0, 1], output: [1]},
           {input: [1, 0], output: [1]},
           {input: [1, 1], output: [0]}]);

var output = net.run([1, 0]); // [0.987]


Brain supports hidden layers, and uses a single hidden layer by default. Training a network is easy (shown above), with options set and passed as a hash:

net.train(data, {
  errorThresh: 0.005,  // error threshold to reach
  iterations: 20000,   // maximum training iterations
  log: true,           // console.log() progress periodically
  logPeriod: 10,       // number of iterations between logging
  learningRate: 0.3    // learning rate
})


train() also returns a hash with training outcomes. Networks can be serialized to and from JSON.

If you are a JavaScript developer looking to implement neural networks, Brain may be the library you need. You may also want to check out the above-mentioned article, which contains some general purpose machine learning libraries along with a few additional neural network libraries.

5. neon

Fast, scalable, easy-to-use Python based Deep Learning Framework by Nervana.

neon's headline feature is speed. Nervana claims: "For fast iteration and model exploration, neon has the fastest performance among deep learning libraries." This is definitely the reason to give this library a look, if you are currently unfamiliar with it.


Developed by Nervana Systems, neon supports convolutional networks, RNNs, LSTMs, GRUs, and more. neon has a lot more going for it as well: a great workflow overview, thorough documentation, and a number of useful tutorials.
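
To give a taste of the API, below is a minimal MLP training sketch in the spirit of neon's MNIST tutorial. Module paths and helper names have shifted between neon releases, so treat the exact imports and the MNIST/Callbacks helpers here as approximations rather than canonical usage:

from neon.backends import gen_backend
from neon.data import MNIST
from neon.initializers import Gaussian
from neon.layers import Affine, GeneralizedCost
from neon.models import Model
from neon.optimizers import GradientDescentMomentum
from neon.transforms import Rectlin, Softmax, CrossEntropyMulti
from neon.callbacks.callbacks import Callbacks

be = gen_backend(backend='cpu', batch_size=128)  # or 'gpu' if available

mnist = MNIST()                # download/load MNIST (newer-release helper)
train_set = mnist.train_iter   # iterator over training minibatches

# Two fully-connected layers: 100 rectified linear units, then a 10-way softmax
layers = [Affine(nout=100, init=Gaussian(scale=0.01), activation=Rectlin()),
          Affine(nout=10, init=Gaussian(scale=0.01), activation=Softmax())]

mlp = Model(layers=layers)
cost = GeneralizedCost(costfunc=CrossEntropyMulti())
optimizer = GradientDescentMomentum(0.1, momentum_coef=0.9)

mlp.fit(train_set, optimizer=optimizer, num_epochs=10, cost=cost,
        callbacks=Callbacks(mlp))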

You can also check out a number of Jupyter notebook versions of the tutorials from a neon Deep Learning meetup, which is nice. If speed in training neural networks is important to you, and you're in the Python ecosystem, check out neon.
