Industry Predictions: Key Trends in 2017

With 2017 almost upon us, KDnuggets brings you opinions from industry leaders on the most relevant and important trends to expect in the coming year.




2017 Trends to Watch: Big Data report, Ovum

Key 2017 trends:

  • Machine learning will be the biggest disruptor for big data analytics in 2017.
  • Making data science a team sport will become a top priority.
  • IoT use cases will push real-time streaming analytics to the front burner.
  • The cloud will sharpen Hadoop-Spark "co-opetition."
  • Security and data preparation will drive data lake governance.

While machine learning continues to grab the headlines, real-time streaming will become the fastest-growing use case.

A perfect storm has transformed real-time streaming from a niche technology to one with broad, cross-industry appeal. Open source technology has lowered barriers to entry for both technology providers and customers; scalable commodity infrastructure has made the processing of large torrents of real-time data in motion economically and technically feasible.

The explosion in bandwidth and smart-sensor technology has opened up use cases ranging from location-based marketing to health and safety, intrusion detection, and predictive maintenance, appealing to a broad cross section of industries.
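Several of these use cases, such as intrusion detection and predictive maintenance, reduce to flagging outliers in a live sensor stream. As a minimal, vendor-neutral sketch of the idea, a rolling z-score check over a window of recent readings can flag values that deviate sharply from recent history:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    away from the mean of the preceding `window` values."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        history.append(value)
    return anomalies

# A steady simulated sensor signal with one spike at index 30.
stream = [20.0 + 0.1 * (i % 5) for i in range(50)]
stream[30] = 95.0
print(detect_anomalies(stream))  # the spike at index 30 is flagged
```

Production streaming engines apply far more sophisticated models, but the windowed, incremental structure is the same: each reading is evaluated against recent state as it arrives, not batched into a warehouse first.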

Within the next 24 months, Ovum expects that the cloud will pass the halfway mark to dominate new big data deployments.

"Big data has emerged from its infancy to transition from buzzword to urgency for enterprises across all major sectors," said Tony Baer, Principal Analyst for Information Management. "The growing pains are being abetted by machine learning, which will lower barriers to adoption of big data-enabled analytics and solutions, and the growing dominance of the cloud, which will ease deployment hurdles."

Ramon Chen, CMO of Data Management Innovator, Reltio

1. AI and analytics vendor M&A activity will accelerate - There’s no doubt there is a massive land grab for anything labeled AI, machine learning, or deep learning. With a limited number of startups offering these integrated capabilities, the quest for relevant insights, and ultimately for recommended actions that support more efficient forecasting and decision-making, will lead to even more aggressive M&A activity in 2017.

2. Data lakes will finally become useful - Many companies that took the data lake plunge in the early days have spent a significant amount of money not only buying into the promise of low-cost storage and processing, but also on a plethora of services to aggregate large pools of big data and make them available for correlation and insight. With existing big data projects recognizing the need for a reliable data foundation, and new projects being combined into a holistic data management strategy, data lakes may finally fulfill their promise in 2017.

3. Data monetization strategies will start to mature - For enterprises to turn the data they use to run their businesses into a potential new revenue stream, the data must be reliable, relevant, segmented, secure, anonymized where necessary, and audited to guarantee ownership. Last year, Gartner highlighted that only 10% of CEOs said they monetize information assets by bartering with them or selling them outright. That number, fueled by modern data management technology, is sure to grow in 2017.

4. Cloud and data security agility will gain further importance - This is a rather obvious prediction, given the fear of data breaches and the reticence of industries such as the financial sector to use public cloud technologies. In 2017, vendors offering Platform as a Service (PaaS) and tools must also do their part by complying with Service Organization Control (SOC) standards and, in the case of healthcare data, with HITRUST (Health Information Trust Alliance), an established security framework for all organizations that create, access, store, or exchange sensitive and regulated data.

5. Systems of Record will have a path to Systems of Engagement, and beyond – In 2011, celebrated author Geoffrey Moore first defined the term Systems of Engagement (SoE), contrasting how Systems of Record (SoR) needed to evolve in order to focus on people, not processes. 2017 may be the year more companies finally shift to SoE, in part due to increased use and adoption of AI. Last year, Geoffrey Moore extended his thinking towards Systems of Intelligence (SoI), combining AI with the big data scale of Internet of Things (IoT). In order to achieve true SoI, companies are now pushing to accelerate SoE in the form of data-driven applications for their workers.
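One concrete piece of the monetization prediction above (item 3) is anonymization. A common technique is keyed pseudonymization: replacing an identifier with a keyed hash so records remain joinable and segmentable without exposing the raw value. A minimal sketch, where the salt and field names are purely illustrative:

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me-regularly"  # illustrative; keep real keys in a secrets manager

def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed hash. The same input always
    yields the same token, so datasets can still be joined on it."""
    return hmac.new(SECRET_SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

customer = {"email": "jane@example.com", "segment": "premium", "spend": 1250.0}
safe = {**customer, "email": pseudonymize(customer["email"])}
print(safe)  # email replaced by an opaque token; segment and spend intact
```

Because the mapping is deterministic, two datasets pseudonymized with the same key can still be correlated, which is exactly the "segmented but anonymized" property a monetization strategy needs.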

Dr. William Bain, CEO and Founder, ScaleOut Software

In 2017, the need for “operational” intelligence to capture highly dynamic business opportunities will shift the focus of big data from the data warehouse to live systems, and we expect to see widespread integration of this capability into production deployments.

In 2017, in-memory computing will enter the mainstream as the enabling technology for adding operational intelligence to live systems, and it will supplant legacy streaming technologies.

In 2017, the adoption of in-memory computing technologies, such as in-memory data grids (IMDGs), will provide the enabling technology to capture perishable opportunities and make mission-critical decisions on live data. Driven by the need for real-time analytics, the IMDG market alone – currently estimated at $600 million – will exceed $1 billion by 2018, according to Gartner.

In-memory computing techniques will leverage the power of machine learning to enhance the value of operational intelligence.

The year 2017 will see accelerated adoption of scenarios that integrate machine learning with the power of in-memory computing, especially in e-commerce systems and the Internet of Things (IoT). In both of these applications, machine learning techniques can dramatically deepen introspection into live data and enhance operational intelligence.

John Hurley, CEO, SmartFile

One trend for 2017 will be more companies replacing tools and processes that give no insight into how files are being used and managed. Understanding how files are accessed and shared, both inside and outside office walls, is critical for protecting internal data as well as the intellectual property that businesses are built on.

The one hurdle we'll face is the C-suite itself, unfortunately. We surveyed our users and found that one of the biggest fears of IT pros is their own leadership. According to our results, 41% of IT personnel are worried about the C-suite's promotion of shadow IT and insecure file-sharing practices, along with a general disregard for security measures. This is something we will all need to monitor and improve on in 2017.

Girish Pancha, CEO and Founder, StreamSets

2017 will be the year organizations begin to rekindle trust in their data lakes. The “dump it in the data lake” mentality compromises analysis and sows distrust in the data. With so many new and evolving data sources like sensors and connected devices, organizations must be vigilant about the integrity of their data and expect and plan for regular, unanticipated changes to the format of their incoming data. Next year, organizations will begin to change their mindset and look for ways to constantly monitor and sanitize data as it arrives, before it reaches its destination.
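The "monitor and sanitize data as it arrives" discipline can start as simply as validating each incoming record against an expected schema before it lands in the lake, so an unanticipated format change is caught at ingest rather than at analysis time. A minimal sketch in Python (the field names are hypothetical):

```python
import json

# Expected shape of an incoming sensor record (illustrative fields).
EXPECTED_SCHEMA = {"device_id": str, "timestamp": str, "temperature": float}

def validate_record(raw):
    """Parse a raw JSON record and return (record, errors).
    Missing fields, wrong types, and unexpected fields are all
    reported, so schema drift is surfaced instead of silently loaded."""
    record = json.loads(raw)
    errors = []
    for field, ftype in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    for field in record:
        if field not in EXPECTED_SCHEMA:
            errors.append(f"unexpected field: {field}")  # likely upstream format change
    return record, errors

good = '{"device_id": "d1", "timestamp": "2017-01-01T00:00:00Z", "temperature": 21.5}'
drifted = '{"device_id": "d1", "timestamp": "2017-01-01T00:00:00Z", "temp_c": 21.5}'
print(validate_record(good)[1])     # no errors
print(validate_record(drifted)[1])  # temperature missing, temp_c unexpected
```

Records that fail validation can be routed to a quarantine area for inspection, which preserves trust in what reaches the lake itself.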

Next year, organizations will stop putting IoT data on a pedestal, or, if you like, in a silo. IoT data needs to be correlated with other data streams, tied to historical or master data or run through artificial intelligence algorithms in order to provide business-driving value. Despite the heralded arrival of shiny new tools that can handle IoT’s massive, moving workloads, organizations will realize they need to integrate these new data streams into their existing data management and governance disciplines to gain operational leverage and ensure application trust.

Data is the final frontier in the quest for continuous IT operations. In 2017, we’ll start to see organizations manage data topologies in the same way they have taken to managing modern applications, networks and IT security: as a living breathing operation that must run reliably and automatically on a continuous basis. But to fulfill that quest, businesses will need to look hard at what changes they need to make to their processes, tooling and even organization to ensure the availability and accuracy of their data in motion.

Greta Roberts, CEO, Talent Analytics, Corp.

Several years ago, I began as Program Chair for the inaugural Predictive Analytics World for Workforce. At the time, it was quite difficult to find presenters with interesting case studies showing measurable business results. In planning for next May's conference, I am struck by the incredible progress in HR and the workforce.

As we look forward into 2017, we wanted to share our 2017 predictions with you for this high momentum area that we are so proud to be a part of.

  1. There will be an explosion of big data talent acquisition solutions claiming to “predict” if this or that candidate will be perfect for your business. At the same time, businesses will begin to realize these claims might be too good to be true.
  2. Businesses will demand their HR teams implement predictive analytics to solve employee related problems that exist outside of HR.
  3. HR (and the workforce in general) will begin to embrace machine learning for areas like talent acquisition.
  4. Industrial and Organizational (I/O) Psychologists will increasingly embrace data science concepts and methods, bringing a company's data scientists and I/O Psychologists closer together.
  5. The industry will see more integration between Applicant Tracking Software and solutions that deliver Predictions as a Service.
  6. Executives will begin embracing concepts like employee lifetime value and will begin learning how to understand their employees as profit centers vs. cost centers.
  7. HR’s interest in “predicting flight risk” will evolve from predicting flight risk for existing employees to predicting flight risk before you even hire your job candidates.

Mike Stonebraker, Ph.D., Co-founder and CTO, Tamr; Recipient of the 2014 A.M. Turing Award

  • “The first billion dollar ROI on a data integration project will be realized.”
  • “There will continue to be a shortage of qualified data scientists. I don’t expect the market to be in equilibrium until 2019 at the earliest. Every major university will have a data science program in place by 2017.”
  • “At the low end, easy-to-use analytic toolkits will come into their own. At the high end, where performance and scalability matter, it will continue to be ‘rocket science’ for another few years.”
  • “Spark still isn’t played out as a technology. Spark will evolve into something that is quite different from what it is today. They will have to address integration with persistent storage and sharing.”

Ihab Ilyas, Co-founder, Tamr; Professor of Computer Science at the University of Waterloo

  • “Companies will have initiatives on how to securely share data sets across silos. This is a gap in the market and several companies will emerge to tackle enterprise data brokerage and sharing.”
  • “Data Analytics will go vertical (financial, medical, etc), and companies that build vertical solutions will dominate the market. General-purpose data analytics companies will start disappearing. Vertical data analytics startups will develop their own full-stack solutions to data collection, preparation and analytics.”
  • “There will be enterprise programs to commercially adopt Deep Learning as a technique. There is far too much structure to ignore, and far too little training data to create reasonable models. AI in the shape of personal assistants and chat bots will replace the hype of cognitive computing.”

Michael O'Connell, Chief Analytics Officer, TIBCO

Some key developments in 2016 included:

  • More emphasis on “representative data”, rigorous analytics and data science to identify insights and understand business issues.
  • Mainstreaming of machine learning and predictive analytics for business, customer and engineering applications.
  • Rise in deep learning, beyond the big internet companies, especially for some specialized applications e.g. fraud in the banking system.
  • Continued rise in engineering analytics – especially in IIoT applications, where anomaly detection is foundational.
  • Significant uptick in “systems of insight”, where insights from analytics are transformed into notifications, alerts and actions on the business.
  • Continued migration to governed data discovery across the corporate landscape – providing self-service, but with guidance and best practices; along with performance, governance and security.
  • Beginnings of hybrid cloud adoption with scalable tenant resources and contextual routing, along with hybrid data and elastic compute engines.

For 2017, I see more action in all these areas, especially in “systems of insight” – turning insights from visual and predictive analytics into actions. This includes real-time streaming analytics for rapid intervention and action at moments of truth in business processes. I also see less of a boundary between data preparation and visual analytics, as data wrangling and insight discovery become more intertwined.
