
Top /r/MachineLearning Posts, December: The Secret Sauce, OpenAI, Google vs. Facebook


December on /r/MachineLearning: Is TensorFlow Google's "secret sauce?", AI leaders unite, an extensive curated list of machine learning resources grows, Google vs. Facebook, and Deep Q Pong.



In December on /r/MachineLearning, TensorFlow stays in the spotlight, The AI Avengers unite, an awesome curated list of resources grows, a friendly challenge is issued, and neural nets play the latest video game craze... Pong!

1. TensorFlow: "Machine learning algorithms aren't the secret sauce. The data is the secret sauce." +333

This article explores the question, "Why did Google open source its core machine learning algorithms?", referring, of course, to TensorFlow. Matt Cutts, head of Google's webspam team, has referred to TensorFlow as Google's "secret sauce," which could make the move to open source it harder to understand, at least for some. But view the symbiotic relationship between algorithm and data through a less mainstream lens and the data emerges as the real "secret sauce": freeing the algorithms gets all sorts of engineers working on perfecting them, while Google reaps the benefits of both the greatest source of secret sauce on the planet and perfected algorithms from The Wild. The article finishes with a discussion of the openness of data, and raises the question of whether this trend toward unbridled data sharing will continue, given that the data is the secret sauce.

The Secret Sauce

2. OpenAI, a non-profit AI research company +311

OpenAI is a newly-launched initiative, involving Musk, Bengio, Sutskever, Brockman, and a Who's Who of other AI personalities, with the goal of advancing digital intelligence in a way that benefits humanity, unconstrained by the pressing need for financial return. Backed by a $1 billion commitment from its founders and donors, OpenAI seeks to focus on positively impacting humanity, while protecting us from the looming Skynet threat. While I say this in jest, it is worth noting that OpenAI takes safeguarding against the potential hazards of AI seriously. In the near term, OpenAI hopes to pursue research and work toward the next set of breakthroughs. We wish OpenAI luck as they seek real justice by - and from - artificial intelligence.

Incidentally, the OpenAI Research Team will be doing a Reddit AMA on January 9, 2016.

3. A curated list of Machine Learning and Deep Learning resources +274

This "awesome" list of { machine | deep } learning resources has been gaining traction over the past several months, and has been a mainstay on social media for some time. Hosted on Github, resources are organized by category, including tutorials, articles, interview information, Kaggle resources, Quora resources, etc. Even if you have seen the list before, it's worth looking at again, given its commit history. Have a great resource of your own that isn't on the list? Consider contributing.

4. Friendly challenge for Facebook and Google: Play each other in a game of Go +270

Redditor feedthecreed shares the idea that Facebook and Google should face off in a friendly game of Go, the ancient Chinese board game. Notoriously difficult to tame with artificial intelligence, Go is considered by some a defining challenge for AI. The OP references this Wired article, which outlines claims by both companies that they have made recent advances toward "beating" Go, making a competition a logical next move. So, come on Facebook and Google! Ready, set...

5. Deep Q Network built in TensorFlow plays Atari Pong +263

This post appeared only a few days before the end of the month, but still garnered the fifth-most upvotes, leading me to believe that, had it been posted a week earlier, it would have been the runaway winner of the month. Regardless, with projects like these, TensorFlow will stay in the headlines and on everyone's tongues for the foreseeable future.

The link above is to the Reddit post, including the lengthy comment thread, which contains additional gems and information from the implementer. The video of the neural network training is shown below. The code can be found here, as can an academic write-up of the network architecture and experiments. More on Q-learning can be found here.
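For readers who want the gist of what such a network is learning, the heart of Q-learning is the Bellman update: the estimate of an action's value is nudged toward the observed reward plus the discounted value of the best action available in the next state. Below is a minimal, illustrative sketch of that target calculation in Python (the function name, variable names, and the discount factor of 0.99 are my own for illustration, not taken from the linked implementation):

import numpy as np

def q_learning_target(reward, next_q_values, done, gamma=0.99):
    # Bellman target for a single transition:
    #   r                             if the episode ended here
    #   r + gamma * max_a' Q(s', a')  otherwise
    if done:
        return reward
    return reward + gamma * float(np.max(next_q_values))

# Example: reward of +1 for scoring a point in Pong, given the
# network's estimated action values for the next frame
print(q_learning_target(1.0, np.array([0.2, 0.5, -0.1]), done=False))

In a Deep Q Network, targets of this form supervise the network's predicted Q-values, with techniques such as experience replay and a separate target network used to keep training stable.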



Bio: Matthew Mayo is a computer science graduate student currently working on his thesis on parallelizing machine learning algorithms. He is also a student of data mining, a data enthusiast, and an aspiring machine learning scientist.
