The Imminent Future of Predictive Modeling

Predictive modeling tools and services are undergoing an inevitable step-change which will free data scientists to focus on applications and insight, and result in more powerful and robust models than ever before. Among the key enabling technologies are new, hugely scalable cross-validation frameworks and meta-learning.

by Dr. Justin Washtell, CTO & Co-founder ForecastThis Inc.

Original article from the Predictive Analytics Times

Over the past two to three years there has been a small explosion of companies offering cloud-based Machine Learning as a Service (MLaaS) and Predictive Analytics as a Service (PAaaS). IBM and Microsoft both have major freemium offerings in the form of Watson Analytics and Azure Machine Learning respectively, with companies like BigML, Ayasdi, LogicalGlue and ErsatzLabs occupying the smaller end of the spectrum.

These are services which allow a data owner to upload data and rapidly build predictive or descriptive models, on the cloud, with a minimum of data science expertise.

Yet as quickly as this has happened, there is already a step-change afoot. As somebody working on enabling technologies in this area, I believe it is no overstatement to say that applied machine learning is undergoing a significant evolution right now - one which represents an inevitable step on the route to truly automatic general purpose predictive modeling. Examples of providers championing this new approach include Satalia, DataRobot, Codilime, and the company for whom I work, ForecastThis.

Unlike conventional MLaaS and PAaaS offerings (as much as any sector that has emerged within the last few years can be described as "conventional"), the technology at the heart of these new services is not based upon any one algorithmic approach. Rather, these services draw upon a huge and diverse range of algorithms and parameters to identify those which optimally model the problem at hand - often combining algorithms in the process.
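The internals of these services are proprietary, but the general idea they share — searching across diverse algorithm families and parameter settings, scored by cross-validation, to find what best fits the problem — can be sketched in a few lines. The following is a minimal, illustrative example using scikit-learn; the candidate algorithms and parameter grids are my own assumptions, not any vendor's actual search space:

```python
# Sketch: automated model selection across diverse algorithm families,
# with each candidate scored by cross-validation. Candidates and grids
# are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [50, 100]}),
    (SVC(), {"C": [0.1, 1.0], "kernel": ["linear", "rbf"]}),
]

best_score, best_model = -1.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5)  # 5-fold cross-validation
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(type(best_model).__name__, round(best_score, 3))
```

A production system would of course search a far larger space, and could go further by ensembling the top candidates rather than keeping only the single winner.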


Why automate predictive modeling?

Almost precisely a year ago, Dr. Mikio L. Braun of the Berlin Technical University published an article in which he details four reasons why automation is unlikely to transform predictive modeling any time soon:

  1. It's all too easy to make silly mistakes when doing data science
  2. It's easy, when using insufficiently robust methods, to observe good results which aren't actually supported by the evidence
  3. One cannot know in advance which approaches will work best, nor comprehensively test all possible approaches
  4. The No Free Lunch theorem suggests that a single automated solution is not possible
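Point 2 in particular is easy to demonstrate. As a toy illustration (mine, not Dr. Braun's): if you evaluate enough candidate models against a weak validation scheme, the best score will look impressive by chance alone, even when every candidate is pure noise:

```python
# Toy demo: select the best of 1000 purely random "models" on a tiny
# holdout set. The winner's score is inflated despite having no skill.
import numpy as np

rng = np.random.default_rng(0)
y_holdout = rng.integers(0, 2, size=20)      # tiny binary holdout set

best_holdout_acc = 0.0
for _ in range(1000):                        # 1000 random "models"
    preds = rng.integers(0, 2, size=20)
    best_holdout_acc = max(best_holdout_acc, (preds == y_holdout).mean())

# Well above the 50% a skill-free model deserves.
print(f"best accuracy among random models: {best_holdout_acc:.2f}")
```

This is exactly why robust, large-scale cross-validation matters when automating the search over many algorithms.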

Remarkably, what Dr. Braun gives here are four reasons precisely why it can and must happen!

Let's break that down...

Read the Entire Article

To learn more about predictive modeling and big data, register to attend one of the upcoming analytics events in Chicago, June 8-11, 2015. Sign up with early bird pricing and discount code KDN150 by April 24th to receive up to $550 off registration.
