KDnuggets Home » News » 2017 » Feb » Opinions, Interviews » Fixing Deployment and Iteration Problems in CRISP-DM ( 17:n05 )

Fixing Deployment and Iteration Problems in CRISP-DM


Many analytic models are not deployed effectively into production while others are not maintained or updated. Applying decision modeling and decision management technology within CRISP-DM addresses this.



This is the final of three articles. The first outlined Four Problems in Using CRISP-DM And How To Fix Them, while the second discussed Bringing Business Clarity To CRISP-DM. This article discusses the two related problems of how analytics are deployed and managed once the analytic work is complete. A focus on the business decision-making involved helps address both problems. In CRISP-DM this involves changes to the Deployment phase and the iteration loop as shown in Figure 1.

Figure 1: Deployment and management activities in CRISP-DM.

A depressingly large number of analytic models that would help the business never actually do so. Many analytic models get developed but are not deployed and used in a reasonable time-frame (or indeed at all). As the survey results on a recent International Institute for Analytics webinar showed, 60% of projects failed to act on their analytics in “months”. Results from other surveys and research suggest that for many of these analytics, months turn into years and that many analytics are never deployed at all. Teams that want to develop analytics that make a difference need to include deploying their analytic into production, and integrating it with their business environment, as a critical step in the project. This is what the Deployment phase of CRISP-DM is for. Several elements go into successful deployment.

The team must select an appropriate analytic delivery option. The style of analytic (visual or numeric, fixed or interactive, for instance) must match the style of decision making intended. An automated decision embedded in a website requires a different approach than a manual decision made by a call center rep. An analytic used by a mobile worker to make a decision differs from one that supports a desktop worker, and so on. A decision model reveals the organizational impact, showing whose behavior will have to change as well as which systems or processes will have to be re-coded around the analytical decision.

Analytic teams must also remember that decision-making requires an analytic model to be wrapped with decision-making logic to make it actionable.  A decision model shows how to do this, modeling the non-analytic elements of the decision-making as well as showing how the analytic is used by the decision. For instance, a decision model for claims processing might include a fraud prediction as well as audit, regulatory and policy-based exclusions.
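As a rough illustration of this wrapping, the sketch below combines a fraud prediction with the kind of non-analytic exclusions a decision model for claims processing might capture. The function name, claim fields, threshold, and specific rules are all hypothetical assumptions, not part of any real decision model.

```python
# Hypothetical sketch: wrapping an analytic fraud score with the
# non-analytic decision logic a decision model would make explicit.
# Field names, thresholds, and rules here are illustrative only.

def decide_claim(claim: dict, fraud_score: float) -> str:
    """Return a routing decision for a claim, combining the analytic
    fraud prediction with policy and regulatory exclusions."""
    # Policy-based exclusion: large claims always go to manual review.
    if claim["amount"] > 50_000:
        return "manual_review"
    # Regulatory exclusion: some claim types cannot be auto-approved.
    if claim["type"] in {"workers_comp"}:
        return "manual_review"
    # Audit sampling logic would also sit here in a real system.
    # Analytic component: route high fraud scores to investigation.
    if fraud_score > 0.8:
        return "fraud_investigation"
    return "auto_approve"

print(decide_claim({"amount": 1_200, "type": "auto"}, 0.05))  # auto_approve
```

Note that the analytic score is only one input: the exclusions run first, which is exactly the structure a decision model makes visible before any code is written.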

Finally, there is the organizational change component of the project. Changing the way people make decisions, or changing the degree to which their systems make decisions for them, is almost always complex. A decision model provides a framework for understanding who is making which decision where and when. It also ties this decision-making to new and existing business metrics. This provides a clear picture of the organizational change that will be necessary.

Successful analytic teams don’t consider the project complete until the business is seeing improvement – producing the analytic is a waypoint, not the destination.

Solution: Plan for analytic evolution

Policies and regulations change. Markets and competitors react. What makes a good decision today might not make a good decision tomorrow. Analytics also age as data changes and must be monitored and kept up to date. An analytic model that is deployed and left unmonitored and unchanged risks becoming less effective and potentially even harmful.

Analytic teams need to identify the business and environmental factors that might cause an analytic model to become less effective. They must define the data that needs to be captured to show that the analytic works and to identify when it starts to work less well for any of these reasons. They should consider whether machine learning and adaptive analytic techniques are appropriate for continually refining the model, and what boundary or alert conditions the business might require for such models. If manual updates will be required, then the schedule or triggers for these updates need to be defined. A clear plan should be developed for evolving each analytic deployed.
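One common way to express such a boundary or alert condition is to compare the production score distribution against the distribution seen at build time. The sketch below uses the population stability index (PSI) for this; the bin proportions and the 0.2 alert threshold are illustrative assumptions, not prescriptions.

```python
# Hypothetical sketch: a boundary/alert condition for a deployed model,
# comparing recent score behavior against a development-time baseline.
# Bin proportions and the 0.2 threshold are illustrative assumptions.
import math

def population_stability_index(expected: list, actual: list) -> float:
    """PSI between two binned score distributions (given as proportions)."""
    psi = 0.0
    for e, a in zip(expected, actual):
        e = max(e, 1e-6)  # guard against log(0) / division by zero
        a = max(a, 1e-6)
        psi += (a - e) * math.log(a / e)
    return psi

baseline = [0.10, 0.20, 0.40, 0.20, 0.10]  # score distribution at build time
recent   = [0.02, 0.10, 0.30, 0.28, 0.30]  # distribution seen in production

psi = population_stability_index(baseline, recent)
if psi > 0.2:  # a common rule of thumb for "significant shift"
    print(f"ALERT: score distribution has shifted (PSI={psi:.3f})")
```

A check like this can run on a schedule and feed the triggers for manual or automated model updates that the evolution plan defines.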

A decision model shows the structure of a decision. This structure can be used to determine what interim results and business outcomes should be tracked. Combined with data about the model and its behavior, this allows ongoing monitoring and improvement of both the analytic model and the business decision it influences.

The decision model also shows clearly which external business influences matter to the decision-making so that these can be watched for change. For instance, new or changed regulations may alter the constraints on a decision and so undermine the effectiveness of an analytic unless changes are made.

Successful analytic teams are focused on evolving, industrial-scale analytic portfolios, not individually hand-crafted one-time models.

To see how one global leader in information technology is using decision modeling, check out this Leading Practices Brief from the International Institute for Analytics.

Bio: James Taylor is the leading expert in how to use analytic technology to build Decision Management Systems that help companies improve decision making and develop an agile, analytic and adaptive business. He provides strategic consulting to companies of all sizes, working with clients in all sectors to adopt decision modeling, analytics and other decision making technology.
