
Introducing: Blocks and Fuel – Frameworks for Deep Learning in Python


Blocks and Fuel are machine learning frameworks for Python developed by the Montreal Institute for Learning Algorithms (MILA) at the University of Montreal. Blocks is built on top of Theano (also developed by MILA) and allows for rapid prototyping of neural network models. Fuel serves as a data processing pipeline and data interface for Blocks.




Theano

Before introducing Blocks, a description of Theano will provide an understanding of the functionality being extended. Theano is a mathematical expression compiler that allows a user to define, optimize and evaluate mathematical expressions. These expressions are programmed symbolically and stored by Theano in a computational graph, allowing for automatic differentiation and compilation into C and CUDA (GPU) code. Gradient calculation is central to many learning algorithms, so automatic differentiation greatly shortens prototyping time, while compilation into C and CUDA allows for execution speeds that rival hand-coded C solutions and greatly surpass them when a GPU is employed.
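
As a minimal sketch of this symbolic workflow (the expression here is purely illustrative): an expression is defined symbolically, Theano derives its gradient automatically, and theano.function compiles both into callable native code.

import theano
import theano.tensor as T

# Define a symbolic scalar expression: y = x^2 + 3x
x = T.dscalar('x')
y = x ** 2 + 3 * x

# Theano differentiates the expression symbolically
dy_dx = T.grad(y, x)

# Compile the expressions into callable functions (C code under the hood)
f = theano.function([x], y)
df = theano.function([x], dy_dx)

print(f(2.0))   # 10.0
print(df(2.0))  # 7.0, i.e. 2*x + 3 evaluated at x = 2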

[Image: Theano computational graph]

Blocks

Blocks, like many other frameworks, provides reusable components for building and training neural models. Unlike other frameworks, however, instead of introducing abstract concepts like ‘layers’ or ‘models’, Blocks directly annotates the Theano computational graph. Blocks’ components are called Bricks, which manage Theano shared variables – the structures central to models, such as weight matrices or convolutional filters. Bricks can contain other Bricks, adding a hierarchical dimension to the otherwise flat Theano computational graph and capturing the hierarchical nature of neural models.
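
As a rough sketch of what this looks like in practice, the following builds a small network from Bricks, in the spirit of the Blocks MNIST tutorial (the layer names and dimensions are illustrative):

from theano import tensor
from blocks.bricks import Linear, Rectifier, Softmax
from blocks.initialization import IsotropicGaussian, Constant

x = tensor.matrix('features')

# A Brick owns the shared variables (weights and biases) of one layer
input_to_hidden = Linear(name='input_to_hidden', input_dim=784, output_dim=100,
                         weights_init=IsotropicGaussian(0.01),
                         biases_init=Constant(0))
input_to_hidden.initialize()

# Applying Bricks annotates the Theano graph rather than hiding it
h = Rectifier().apply(input_to_hidden.apply(x))

hidden_to_output = Linear(name='hidden_to_output', input_dim=100, output_dim=10,
                          weights_init=IsotropicGaussian(0.01),
                          biases_init=Constant(0))
hidden_to_output.initialize()
y_hat = Softmax().apply(hidden_to_output.apply(h))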

For model building and training, Blocks provides many common neural network operations, such as non-linear activations like ReLUs, as well as state-of-the-art ‘step rules’ for gradient descent, including AdaGrad, ADADELTA, Adam and RMSProp. To avoid overfitting, Blocks allows monitoring on a validation set and incorporates well-known regularisation methods like weight decay, weight noise and dropout. It is also possible to serialise models during training, allowing the training process to be resumed later – very pertinent given that training can often take many hours or even days.
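
A minimal training-loop sketch might look as follows, assuming the y_hat output from the model sketch above and a Fuel data stream like the one shown in the next section (the step rule, epoch count and checkpoint filename are all illustrative choices):

from theano import tensor
from blocks.algorithms import GradientDescent, Adam
from blocks.bricks.cost import CategoricalCrossEntropy
from blocks.extensions import FinishAfter, Printing
from blocks.extensions.saveload import Checkpoint
from blocks.graph import ComputationGraph
from blocks.main_loop import MainLoop

# y_hat comes from the model sketch above; data_stream is a Fuel
# stream like the one constructed in the Fuel section below
y = tensor.lmatrix('targets')
cost = CategoricalCrossEntropy().apply(y.flatten(), y_hat)

# The computational graph gives direct access to the model parameters
cg = ComputationGraph(cost)
algorithm = GradientDescent(cost=cost, parameters=cg.parameters,
                            step_rule=Adam())

main_loop = MainLoop(
    algorithm=algorithm,
    data_stream=data_stream,
    extensions=[FinishAfter(after_n_epochs=10),
                Checkpoint('model.tar'),  # serialise so training can resume
                Printing()])
main_loop.run()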

[Image: Blocks model construction example]

Fuel

Fuel provides a standard interface for interacting with and iterating over data in a variety of formats, and for pre-processing data on the fly. HDF5 is the standard data format, and the Fuel API enables metadata and annotations to be added to datasets. During training, datasets can be iterated over in sequential or shuffled mini-batches, whether the data fits in memory or must be processed out-of-core. Sampling schemes for cross-validation and bootstrapping are also provided, in addition to pre-processing methods like image cropping and extracting n-grams from text. These functions can easily be extended, as well as pipelined for more complicated processing.
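
As a short sketch using Fuel's built-in MNIST wrapper (assuming the dataset has already been downloaded and converted to Fuel's HDF5 format), the batch size and the choice of transformer here are illustrative:

from fuel.datasets import MNIST
from fuel.schemes import ShuffledScheme
from fuel.streams import DataStream
from fuel.transformers import Flatten

# MNIST ships with Fuel as an HDF5-backed dataset
mnist = MNIST(which_sets=('train',))

# Iterate over the data in shuffled mini-batches of 256 examples
data_stream = DataStream.default_stream(
    mnist,
    iteration_scheme=ShuffledScheme(mnist.num_examples, batch_size=256))

# Transformers pre-process on the fly and can be chained into pipelines;
# here the 28x28 images are flattened into 784-dimensional vectors
data_stream = Flatten(data_stream, which_sources=('features',))

for features, targets in data_stream.get_epoch_iterator():
    pass  # each iteration yields one pre-processed mini-batch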

[Image: Fuel data loading example]

Conclusion

Blocks and Fuel join a growing ecosystem of Python deep learning frameworks such as Pylearn2 and GroundHog (both also developed by MILA), Lasagne and Keras, alongside others like Caffe for C++, Mocha for Julia, Deeplearning4j for Java and Torch for LuaJIT. Blocks and Fuel certainly look like promising complementary frameworks, but the only way to know for certain is to try them out!

[Image: Blocks model training example]


