
Top /r/MachineLearning Posts, Apr 5-11: Amazon Machine Learning, Numerical Optimization, and Conditional Random Fields


Amazon Machine Learning as a Service, Numerical Optimization, Extracting data from NYTimes recipes, Intro to Machine Learning with scikit-learn, and more.



Grant Marshall

This week on /r/MachineLearning, Amazon announces its new ML service, we learn about conditional random fields, and we see a probabilistic theory of deep learning.

1. Introducing Amazon Machine Learning – Make Data-Driven Decisions at Scale +88

This is a big announcement – Amazon is getting into the ML-as-a-service game with this offering. The service makes most of the machine learning process available through an online web app, walking you through choices ranging from the training/testing split for your input data to the evaluation metric. If you’re looking for an end-to-end analytics solution, it will be interesting to see how this platform develops.

2. Numerical Optimization: Understanding L-BFGS +56

This article is an introduction to L-BFGS for numerical optimization. Given how central numerical optimization is to training machine learning models, this is a good topic to dive into. Give it a read if it matches your interests.
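To make the topic concrete, here is a minimal sketch of running L-BFGS in practice (assuming SciPy is available; the article itself covers the algorithm's internals, not this API). We minimize the classic Rosenbrock test function via `scipy.optimize.minimize` with the `L-BFGS-B` method, SciPy's limited-memory BFGS implementation:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """Classic non-convex test function; global minimum at (1, 1)."""
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    """Analytic gradient; L-BFGS uses gradients to build its low-rank
    approximation of the inverse Hessian."""
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

result = minimize(rosenbrock, x0=np.array([-1.0, 2.0]),
                  jac=rosenbrock_grad, method="L-BFGS-B")
print(result.x)  # should land very close to [1.0, 1.0]
```

Supplying the analytic gradient (`jac`) is what lets L-BFGS converge quickly; without it, SciPy falls back to slower finite-difference estimates.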

3. Extracting Structured Data From Recipes Using Conditional Random Fields +56

This New York Times piece dives into extracting structured data with CRFs, using recipes from their NYT Cooking service as the data source. It is an interesting case study, and it is presented very well.

4. New video series: Intro to machine learning with scikit-learn (Kaggle) +38

This video series adds a new way to learn about using scikit-learn for machine learning. It comes from Kaggle, and as you might expect, this gives it a very practical bent. If you’re a beginner looking to start Kaggle challenges, or any other sort of machine learning task, from scratch, this is a good series.
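As a taste of what the series covers (assuming scikit-learn is installed; the dataset and model here are standard toy choices, not necessarily the ones the videos use), a complete load/split/fit/evaluate loop takes only a few lines:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load a built-in toy dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit a k-nearest-neighbors classifier and score it on held-out data.
model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {accuracy:.2f}")
```

This fit/predict/score pattern is uniform across scikit-learn's estimators, which is much of why the library is so approachable for beginners.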

5. A Probabilistic Theory of Deep Learning (x-post: r/Compressivesensing) +33

This blog post goes into the probabilistic theory behind deep learning. It is an interesting read if you’re well-versed in probability theory and want to learn how it relates to deep learning. The author links to an arXiv PDF addressing these ideas that will appeal to the probability theorists out there.
