Update: Google TensorFlow Deep Learning Is Improving

The recent open sourcing of Google's TensorFlow was a significant event for machine learning. While the original release was lacking in some ways, development continues and improvements are already being made.



3. They released a TensorFlow model pre-trained on the ImageNet dataset

So, while it's good to know that TensorFlow is getting faster, and that Google is utilizing deep learning at continually higher levels, this is where things get a bit more eyebrow-raising. Hammerbacher brings to our attention, via Dean and Vinyals' slides, that Google has released its best ImageNet-pretrained image classifier yet as a TensorFlow model, described in this paper, achieving 3.5% top-5 error on the validation set. Google has released code optimized for both desktop and mobile environments, which brings into focus its aim of supporting TensorFlow across a wide and varied base of devices.

[Image: example image classification, run on a photo of Grace Hopper]

There is a Google Research blog post detailing how to classify images with TensorFlow, as well as a tutorial on using the pre-trained image classifier on the TensorFlow site.
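For a concrete sense of what that tutorial walks through, here is a minimal sketch of classifying an image with the released pre-trained model, assuming a downloaded frozen graph; the file paths and tensor names ('softmax:0', 'DecodeJpeg/contents:0') follow the tutorial's released files, but treat them as assumptions to verify against your own download.

   import tensorflow as tf

   # Assumed paths: the tutorial's frozen Inception graph and any JPEG to classify.
   GRAPH_PATH = 'classify_image_graph_def.pb'
   IMAGE_PATH = 'grace_hopper.jpg'

   # Load the frozen GraphDef into the default graph.
   with tf.gfile.FastGFile(GRAPH_PATH, 'rb') as f:
       graph_def = tf.GraphDef()
       graph_def.ParseFromString(f.read())
       tf.import_graph_def(graph_def, name='')

   with tf.Session() as sess:
       # Feed raw JPEG bytes in, read softmax class probabilities out.
       softmax = sess.graph.get_tensor_by_name('softmax:0')
       image_data = tf.gfile.FastGFile(IMAGE_PATH, 'rb').read()
       predictions = sess.run(softmax, {'DecodeJpeg/contents:0': image_data})
       # Indices of the five most probable ImageNet classes.
       print(predictions[0].argsort()[-5:][::-1])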

Two points become very clear from this news.

First, Google deep learning projects are going to continue to come fast and furious, with boundaries being pushed further and further, and at a quicker pace. Deep learning is quite likely the leading edge of what we have long envisioned as "true AI," and Google is at the forefront of this research. We will probably soon reach a point where rapid gains on previously-unsolvable problems become passé, which is remarkable to consider, given the state of the technology a mere 5 years ago.

Second, Google is invested in TensorFlow. I suspect that future deep learning innovations from Google will be accompanied solely by publicly-available TensorFlow code, which makes its mainstream adoption all but inevitable, delayed only by however long it takes to get the distributed version open sourced. Deep learning researchers should get used to typing

   import tensorflow as tf

as quickly as possible.
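
For the unfamiliar, here is an illustrative sketch of what typically follows that import line: TensorFlow code first builds a computation graph, then evaluates it in a session.

   import tensorflow as tf

   # Building a graph: nothing is computed yet.
   a = tf.constant(2)
   b = tf.constant(3)
   total = tf.add(a, b)

   # Launching a session actually executes the graph.
   with tf.Session() as sess:
       print(sess.run(total))  # prints 5

This deferred-execution design is what lets TensorFlow map the same graph onto CPUs, GPUs, and, eventually, distributed clusters.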

TensorFlow recap and beyond

Here are a few important points of review regarding TensorFlow, for the uninitiated, as well as a few suggestions for where interested parties may like to go with it from here.

First, you can read more about the technical underpinnings of TensorFlow here in its whitepaper. TensorFlow can be downloaded from its official GitHub repo here.

To get up to speed with TensorFlow, first check out its official tutorials. Some additional examples can be found here, and a set of tutorials based on these Theano tutorials is available here.

In the video below, from a talk given at a recent Bay Area Machine Learning Symposium, Dean covers several topics and takes questions from the audience. The talk predates the TensorFlow open-sourcing announcement, but provides a high-level overview of the system.



Lastly, an innovative implementation of a convolutional neural network for text classification in TensorFlow can be found here (with code located here). Given that many recent deep learning research breakthroughs have focused on images, it's nice to see something text-related implemented in a deep learning tutorial.
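To make the approach concrete, here is a rough sketch of the core idea behind such a text CNN (not the linked author's exact code; the dimensions and variable names are hypothetical): word ids are embedded, convolved over windows of words, max-pooled over time, and fed to a softmax layer.

   import tensorflow as tf

   # Hypothetical dimensions, for illustration only.
   vocab_size, embed_dim, seq_len = 10000, 128, 56
   filter_size, num_filters, num_classes = 3, 100, 2

   x = tf.placeholder(tf.int32, [None, seq_len])        # padded word-id sequences
   y = tf.placeholder(tf.float32, [None, num_classes])  # one-hot labels

   # Embed word ids and add a channel dimension for conv2d.
   embeddings = tf.Variable(tf.random_uniform([vocab_size, embed_dim], -1.0, 1.0))
   embedded = tf.expand_dims(tf.nn.embedding_lookup(embeddings, x), -1)

   # Convolve a window of filter_size words across the sequence.
   W = tf.Variable(tf.truncated_normal([filter_size, embed_dim, 1, num_filters], stddev=0.1))
   b = tf.Variable(tf.constant(0.1, shape=[num_filters]))
   conv = tf.nn.relu(tf.nn.conv2d(embedded, W, strides=[1, 1, 1, 1], padding='VALID') + b)

   # Max-over-time pooling keeps each filter's strongest response.
   pooled = tf.nn.max_pool(conv, ksize=[1, seq_len - filter_size + 1, 1, 1],
                           strides=[1, 1, 1, 1], padding='VALID')
   features = tf.reshape(pooled, [-1, num_filters])

   # Linear softmax layer over the pooled features.
   W_out = tf.Variable(tf.truncated_normal([num_filters, num_classes], stddev=0.1))
   b_out = tf.Variable(tf.constant(0.1, shape=[num_classes]))
   logits = tf.matmul(features, W_out) + b_out
   loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))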

It's clear that a stripped-down version was released into the wild on November 9th, as opposed to the beefy in-house implementation running much of Google's current deep learning research and production systems. However, given that it's developed and supported by Google, TensorFlow is not going anywhere, and, as this tutorial by Dean and Vinyals suggests, some of its top minds are actively developing the external product as well as the proprietary version. Initial concerns with the software are already being actively addressed, meaning that TensorFlow will find its way into deep learning research and production far and wide.

Bio: Matthew Mayo is a computer science graduate student currently working on his thesis on parallelizing machine learning algorithms. He is also a student of data mining, a data enthusiast, and an aspiring machine learning scientist.
