Implementing Neural Networks in JavaScript

JavaScript is one of the most prevalent and fastest-growing languages in existence today. Get a quick introduction to implementing neural networks in the language, and direction on where to go from here.



By Tine Wiederer, webkid.io.

Neural networks make it possible to solve complicated nonlinear problems. They can be used in various areas such as signal classification, time-series forecasting and pattern recognition. A neural network is a model inspired by the human brain and consists of multiple connected neurons. The network consists of a layer of input neurons (where the information goes in), a layer of output neurons (where the result can be read off) and a number of so-called hidden layers in between:

neural network scheme

To get a deeper understanding, I recommend checking out Neural Networks and Deep Learning.

In recent years, multiple JavaScript frameworks have been developed that can help you create, train and use neural networks for different purposes. In this blog post, you will learn how to set up a network and use it for classifying images.

A common example for getting started with neural networks is the classification of handwritten digits. To achieve good results, a network has to be trained properly, and for that we need a set of data called a training set. In our example, we will use the MNIST numbers, a set of thousands of 28x28px binary images of handwritten numbers from 0 to 9:


mnist numbers

The MNIST database, containing 60,000 examples for training and 10,000 examples for testing, can be downloaded from LeCun’s website. Instead of downloading the database and converting the data to actual images, we can use the helpful library MNIST digits, which creates training and test sets automatically.

const mnist = require('mnist'); 

const set = mnist.set(700, 20);

const trainingSet = set.training;
const testSet = set.test;


The above code creates a training set of 700 images and a test set with 20 elements. When creating the sets manually, it is important to make sure there are no duplicate elements in the two sets. If you’re using the MNIST digits library, this is checked automatically.
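
To get a feeling for the data, it helps to inspect a single element. Assuming the format produced by the mnist package, each element holds an input array of 784 pixel values and a one-hot output array with one entry per digit class:

// inspect one training element (format as produced by the mnist package)
const example = trainingSet[0];

console.log(example.input.length);  // 784 pixel values (28 x 28)
console.log(example.output);        // one-hot label, e.g. [0,0,0,0,0,1,0,0,0,0] for a 5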

After creating the data for training and testing, we can set up the network. We will use the library synaptic.js, which lets us create a neural network and configure its various parameters. First of all, we have to determine how many input and output neurons are needed. As the size of each image is 28x28px, the number of pixels the network has to take as input is 28 x 28 = 784. The digits should be assigned to one of ten classes, so the number of output neurons will be 10. Furthermore, the network should have at least one hidden layer, which in this example is set to consist of 100 neurons.

The following code sets up the network described above:

const synaptic = require('synaptic');

const Layer = synaptic.Layer;
const Network = synaptic.Network;
const Trainer = synaptic.Trainer;

const inputLayer = new Layer(784);
const hiddenLayer = new Layer(100);
const outputLayer = new Layer(10);

inputLayer.project(hiddenLayer);
hiddenLayer.project(outputLayer);

const myNetwork = new Network({
    input: inputLayer,
    hidden: [hiddenLayer],
    output: outputLayer
});
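
As a side note, synaptic.js also ships an Architect helper that, as far as I know, wires up the same kind of feed-forward network in a single line. The manual setup above should be roughly equivalent to:

// shortcut for a 784-100-10 perceptron using synaptic's Architect
const Architect = synaptic.Architect;
const perceptron = new Architect.Perceptron(784, 100, 10);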


To train the network with our training set, we can use the Trainer provided by synaptic.js. The train() function takes the data used for the training and an object with options for configuring the trainer.

const trainer = new Trainer(myNetwork);
trainer.train(trainingSet, {
    rate: .2,
    iterations: 20,
    error: .1,
    shuffle: true,
    log: 1,
    cost: Trainer.cost.CROSS_ENTROPY
});


As options, you can set the ‘rate’, which is the learning rate for the training. The ‘iterations’ option defines after how many iterations the training terminates at the latest. The ‘error’ is the target error: if it is reached during the training, the training stops early. With the ‘shuffle’ option, you can specify whether the training set is shuffled or kept in its original order. You can find more detailed information about all possible options in the synaptic.js documentation.

To get the general idea, in this example we set the maximum number of iterations to 20 to make sure we don’t have to wait for hours before the training is finished. Note that the network won’t be trained very well after only 20 iterations. To get better results, you’ll have to increase the number and be patient.

If you want to see the progress of the training, you can use the ‘log’ option to print out the current error of the network every given number of iterations (here, after every iteration). The lower the error, the better the network is trained.


error log

You might notice that the error fluctuates a bit instead of decreasing steadily, but on the whole it gets smaller. If this is not the case, you can try working with a smaller learning rate.

After the training is finished, the test set can be used to check how well the network classifies new elements. For this, we can use the activate() function of the network, which takes the element to classify as a parameter. To check the result, it can be printed out and compared to the expected output.

console.log(myNetwork.activate(testSet[0].input));
console.log(testSet[0].output);
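
The output of activate() is an array of ten values, one per digit class; the index of the largest value is the predicted digit. As a minimal sketch (assuming the input/output format of the mnist package described above), the accuracy on the whole test set could be estimated like this:

// index of the largest value in an output array = predicted digit
const argmax = arr => arr.indexOf(Math.max(...arr));

let correct = 0;
for (const sample of testSet) {
    const prediction = argmax(myNetwork.activate(sample.input));
    const expected = argmax(sample.output);
    if (prediction === expected) {
        correct++;
    }
}

console.log('accuracy:', correct / testSet.length);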


Please note that the results achieved by the network in this example are not great. Only around 50% of the elements in the training set get classified correctly. This is because of the small training set and the low number of iterations used in the training. To improve the results, the number of elements in the training set and the number of iterations should be increased. Warning: this will cause the training to take quite a while longer!
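
For example, you could train on a larger set and run more iterations (the exact numbers below are only an illustration, not values from the original setup):

// larger training set and more iterations: better results, longer training time
const biggerSet = mnist.set(2000, 500);

trainer.train(biggerSet.training, {
    rate: .2,
    iterations: 100,
    error: .1,
    shuffle: true,
    log: 1,
    cost: Trainer.cost.CROSS_ENTROPY
});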

Besides the common MNIST example, there are many other applications in which neural networks can be useful. Another interesting experiment that uses synaptic.js is the T-Rex ML Player, in which a neural network learns to play the T-Rex game.

If you want to have a look at similar JavaScript libraries for creating neural networks, here are some other projects you should check out:

Bio: Tine Wiederer studied Media and Computing in Berlin. Now she works as a data scientist and front-end developer at webkid.io.

Original. Reposted with permission.
