Top /r/MachineLearning Posts, Apr 5-11: Amazon Machine Learning, Numerical Optimization, and Conditional Random Fields
Amazon Machine Learning as a Service, Numerical Optimization, Extracting data from NYTimes recipes, Intro to Machine Learning with scikit-learn, and more.
Grant Marshall
This week on /r/MachineLearning, Amazon announces its new ML service, we learn about conditional random fields, and we see a probabilistic theory of deep learning.
1. Introducing Amazon Machine Learning – Make Data-Driven Decisions at Scale +88
This is a big announcement – Amazon is getting into the ML-as-a-service game with this offering. The service exposes most of the machine learning workflow through an online web app, covering choices from the training/testing split ratio for your input data to the evaluation metric. If you’re looking for an end-to-end analytics solution, it will be interesting to see how this platform develops.
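The training/testing split mentioned above is easy to picture in code. Here is a minimal pure-Python sketch (the `train_test_split` helper is hypothetical, and the 70/30 ratio is just an illustrative default, not Amazon's documented behavior):

```python
import random

def train_test_split(rows, train_ratio=0.7, seed=0):
    """Randomly partition rows into train and test sets (illustrative sketch)."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(100)))
```

Services like this make the same choice for you behind a form field, which is convenient but worth understanding.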
2. Numerical Optimization: Understanding L-BFGS +56
This article is an introduction to L-BFGS, a quasi-Newton method for numerical optimization. Given how central numerical optimization is to machine learning, this is a good topic to dive into. Give this a read if it matches your interests.
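To give a rough flavor of what the article covers, here is a minimal pure-Python sketch of L-BFGS using the standard two-loop recursion and a backtracking (Armijo) line search. All helper names are made up for this example, and a real project would use a library implementation such as SciPy's:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def axpy(alpha, x, y):
    # Returns alpha * x + y, element-wise.
    return [alpha * xi + yi for xi, yi in zip(x, y)]

def lbfgs(f, grad, x0, m=5, max_iter=100, tol=1e-8):
    """Minimize f from x0, keeping the last m curvature pairs (toy sketch)."""
    x = list(x0)
    g = grad(x)
    history = []  # (s, y, rho) triples from recent iterations
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        # Two-loop recursion: r approximates H^{-1} g without storing H.
        q = list(g)
        alphas = []
        for s, y, rho in reversed(history):
            a = rho * dot(s, q)
            alphas.append(a)
            q = axpy(-a, y, q)
        gamma = 1.0
        if history:
            s, y, _ = history[-1]
            gamma = dot(s, y) / dot(y, y)  # initial Hessian scaling
        r = [gamma * qi for qi in q]
        for (s, y, rho), a in zip(history, reversed(alphas)):
            b = rho * dot(y, r)
            r = axpy(a - b, s, r)
        d = [-ri for ri in r]  # search direction
        # Backtracking line search with the Armijo sufficient-decrease test.
        fx, gd, t = f(x), dot(g, d), 1.0
        while f(axpy(t, d, x)) > fx + 1e-4 * t * gd:
            t *= 0.5
        x_new = axpy(t, d, x)
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        if dot(s, y) > 1e-10:  # keep the pair only if curvature is positive
            history.append((s, y, 1.0 / dot(s, y)))
            if len(history) > m:
                history.pop(0)
        x, g = x_new, g_new
    return x
```

The appeal of L-BFGS is exactly what the sketch shows: it never forms the Hessian, only a short history of gradient and position differences.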
3. Extracting Structured Data From Recipes Using Conditional Random Fields +56
This New York Times piece dives into extracting structured data with conditional random fields (CRFs). The post uses recipe data from their NYT Cooking service to show how CRFs can pull structured fields out of free-text ingredient lines. It's an interesting case study and is presented very well.
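CRFs label each token in a sequence by conditioning on hand-designed features of the token and its neighbors. A toy feature extractor for ingredient lines might look like the following (the feature names and unit list are illustrative, not taken from the NYT post):

```python
UNITS = {"cup", "cups", "tablespoon", "tablespoons", "teaspoon", "teaspoons"}

def token_features(tokens, i):
    """Features for tokens[i] that a CRF could condition on (toy sketch)."""
    tok = tokens[i]
    return {
        "word.lower": tok.lower(),
        "looks.numeric": tok.replace("/", "").replace(".", "").isdigit(),
        "known.unit": tok.lower() in UNITS,
        "position": i,
        "prev.word": tokens[i - 1].lower() if i > 0 else "<START>",
    }

line = ["1", "cup", "sugar"]
features = [token_features(line, i) for i in range(len(line))]
```

A CRF trained on such features could then tag each token as a quantity, unit, or ingredient name, which is the kind of labeling the NYT post describes.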
4. New video series: Intro to machine learning with scikit-learn (Kaggle) +38
This video series adds a new way to learn about using scikit-learn for machine learning. It comes from Kaggle, and as you might expect, this gives it a very practical bent. If you’re a beginner looking to start doing Kaggle challenges or any other sort of machine learning tasks from scratch, this is a good series.
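A classic first classifier in introductory material like this is k-nearest neighbors, whose core idea fits in a few lines of plain Python (a toy 1-NN sketch to convey the concept, not the scikit-learn API):

```python
def euclidean(a, b):
    """Straight-line distance between two points given as number lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict_1nn(train_X, train_y, query):
    """Return the label of the training point closest to query."""
    best = min(zip(train_X, train_y), key=lambda xy: euclidean(xy[0], query))
    return best[1]

# Two labeled points; each query takes the label of its nearest neighbor.
labels = [predict_1nn([[0, 0], [5, 5]], ["a", "b"], q) for q in ([1, 1], [4, 4])]
```

scikit-learn wraps the same idea (with k neighbors, efficient search structures, and a uniform fit/predict interface), which is what makes it a good teaching library.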
5. A Probabilistic Theory of Deep Learning (x-post: r/Compressivesensing) +33
This blog post goes into the probabilistic theory behind deep learning. It's an interesting read if you're well versed in probability theory and want to learn how it relates to deep learning. The author links to an arXiv PDF developing these ideas that will appeal to the probability theorists out there.