10 Data Analytics Predictions for 2016

Monday, Jan. 4, 2016, by Ann All

As companies grow more reliant on data analytics, technology must keep pace with changing business needs, say data experts.

The last few years have seen some big changes in data analytics, with companies creating new business models and shaking up existing ones based on increasingly sophisticated uses of data. For example, health insurer Humana has made analytics the centerpiece of its clinical operations and customer relations efforts. It uses predictive analytics models to identify members who would benefit from regular contact with clinical professionals, helping them coordinate care and make needed lifestyle changes.

Not surprisingly, then, when we asked several data experts for their outlooks on data analytics for the coming year, we got some pretty interesting insights.

Changes in Self-Service Data Analytics

Despite investments in Hadoop, data warehouses and marts, and self-service visualization products, enterprises will continue to struggle to satisfy the analytical requirements of everyday business users and analysts, said Satyen Sangani, CEO and co-founder of information management startup Alation. Challenges include data literacy, time to insight, data availability and data quality.

"A variety of tools will come in to the self-service mainstream, including data catalogs to help people discover data, data preparation tools to help people manipulate data and advanced analytical packages to help people leverage the power of data for discovery and prediction," Sangani said.

Data Infrastructure Consolidation

Sangani predicts that data infrastructure players will consolidate further in 2016, with deep-pocketed enterprise software giants like Oracle and Microsoft acquiring cash-poor data companies to add new, accretive growth streams. "Keep an eye on companies with price-to-sales ratios of less than five as potential targets for this year's buys," he said.
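
For context, price-to-sales is simply market capitalization divided by annual revenue; a toy calculation with made-up numbers shows how such a screen works:

```python
# Price-to-sales ratio = market capitalization / annual revenue. The figures
# below are made up purely to illustrate the screen Sangani suggests.
def price_to_sales(market_cap: float, annual_revenue: float) -> float:
    return market_cap / annual_revenue

ratio = price_to_sales(market_cap=400e6, annual_revenue=100e6)
print(ratio)  # 4.0 -- below 5, so this hypothetical company would fit the screen
```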

Public Cloud Platforms Gaining Momentum

With the success of Amazon Web Services (AWS), Google Cloud and others, it's clear that many companies are now comfortable storing their data and building data-driven applications on public cloud platforms, said Manish Sood, CEO and founder of data management company Reltio. "While some industries remained cautious, the cost and elasticity benefits cannot be ignored," he said. "AWS's recent announcement of QuickSight is yet another example set to potentially disrupt the multi-billion dollar BI and analytics market."

Hadoop Will Get Thrown for a Loop

While interest in 10-year-old open source technology Hadoop remains strong and usage is maturing, there are lots of new options that either complement or provide an alternative to Hadoop for Big Data processing, said Sood. "The rapid ascent of Apache Spark and Apache Drill is one example. We'll continue to see more options in the New Year," he noted.

Spark will reinvigorate Hadoop, said Monte Zweben, co-founder and CEO of Splice Machine, a provider of relational database management system (RDBMS) technology. He predicts that in 2016, nine out of every 10 projects on Hadoop will involve the use of Spark. In addition, Spark will "kill" MapReduce, he said. "MapReduce is quite esoteric. Its slow, batch nature and high level of complexity can make it unattractive for many enterprises. Spark, because of its speed, is much more natural, mathematical, and convenient for programmers."
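
To illustrate the convenience gap Zweben describes, here is a minimal word-count sketch using Spark's Python API; the input path is a hypothetical placeholder. The same job in classic MapReduce requires separate mapper and reducer classes plus driver boilerplate, while in Spark it collapses into a few chained calls:

```python
# Minimal word count in Spark's Python API -- an illustrative sketch, not code
# from the article. The input path is a hypothetical placeholder.
from pyspark import SparkContext

sc = SparkContext(appName="WordCount")

counts = (
    sc.textFile("hdfs:///data/input.txt")   # hypothetical input location
      .flatMap(lambda line: line.split())   # split each line into words
      .map(lambda word: (word, 1))          # pair each word with a count of 1
      .reduceByKey(lambda a, b: a + b)      # sum the counts per word
)

for word, count in counts.take(10):         # pull a small sample to the driver
    print(word, count)

sc.stop()
```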

Operational, Analytical Data Systems Team Up

While an accepted best practice has been to keep operational and analytic systems separate to prevent analytic workloads from disrupting operational processing, an emerging practice called HTAP (hybrid transaction/analytical processing) will become more accepted this year, said John Schroeder, CEO and co-founder of MapR, a provider of Hadoop technology. Gartner coined the term in 2014 to describe a new generation of data platforms that can perform both online transaction processing (OLTP) and online analytical processing (OLAP) without requiring data duplication.

"In 2016, we will see converged approaches become mainstream as leading companies reap the benefits of combining production workloads with analytics to adjust quickly to changing customer preferences, competitive pressures and business conditions," Schroeder said. "This convergence speeds the 'data to action' cycle for organizations and removes the time lag between analytics and business impact."

Data-driven applications such as LinkedIn and Facebook have for years delivered a single unified contextual experience that combines analytically relevant insight with operational application execution, said Reltio's Sood. "The enterprise will see an accelerated adoption of data-driven applications to solve their most pressing challenges," he said.

Tagging Data at Source Boosts Big Data Value

Most businesses are still struggling to extract value from the data they gather because they largely look at the data in isolation, said Lasse Andresen, CTO at ForgeRock, a provider of identity and access management software.

"In order to make sense of Big Data, it must be examined within the context it was collected," Andresen said. "By tagging data at the point of collection with additional contextual information, the value that can be extracted from it across an organization is multiplied significantly. Key factors such as where and when the data was collected or who/what it was collected from are central to understanding data more effectively. Consent, context, identity and security data points will all significantly boost the value of Big Data exponentially."

NoSQL Goes Mainstream

With the rise of Web, mobile and IoT applications, use of NoSQL is becoming more popular, said Bob Wiederhold, CEO of Couchbase, provider of NoSQL database technology. "In 2016, we'll see more enterprises re-platform their data management systems using NoSQL to overcome the limits of their 30-year-old legacy relational systems," he said.

While NoSQL databases are a better fit than traditional relational databases for supporting Web, mobile and IoT applications, said Ravi Mayuram, Couchbase's senior vice president of products and engineering, there is an IT skills gap for building new data management platforms.

"It is incumbent that database developers evolve their skills in order to meet these new platforms, but the technology innovators must remove much of the friction to make the transition from relational databases to NoSQL as easy as possible by extending traditional tools and languages," Mayuram said. "This will be done in 2016 in both the private sector and in academia, further fueling the growth of enterprise NoSQL deployments."

Data Storage Innovation

While spinning disks help companies keep pace with data growth, it takes too much time to get the data off the disk, said Splice Machine's Zweben, who predicts a huge future for solid-state drives (SSDs). "With SSD, there are no moving parts, much like being in memory, so the process of getting the exact data you need is extremely fast. In 2016, all new applications will use SSDs, and spinning disks will become a thing of the past," he said.

As consumer demand for flash memory continues to drive down its cost, we will see more flash deployments in Big Data in the enterprise, said MapR's Schroeder. "The optimal solution will combine flash and disk to support both fast and dense configurations. In 2016, this new generation of software-based storage that enables multi-temperature solutions will proliferate so organizations will not have to choose between fast and dense; they will be able to get both."
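
A toy sketch of the multi-temperature placement policy Schroeder describes, with hypothetical tier names and an assumed one-day "hot" window: recently accessed data is routed to flash for speed, colder data to spinning disk for density.

```python
# Toy "multi-temperature" placement policy. The tier names and the one-day
# "hot" window are illustrative assumptions, not from the article.
import time

FLASH_TIER, DISK_TIER = "flash", "disk"
HOT_WINDOW_SECONDS = 24 * 3600  # assume data touched within a day is "hot"

def choose_tier(last_access_ts, now=None):
    """Route recently accessed data to flash, colder data to spinning disk."""
    now = now if now is not None else time.time()
    return FLASH_TIER if (now - last_access_ts) < HOT_WINDOW_SECONDS else DISK_TIER

now = time.time()
print(choose_tier(now - 600, now))        # accessed 10 minutes ago -> flash
print(choose_tier(now - 7 * 86400, now))  # untouched for a week   -> disk
```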

Distributed Data Workloads

Technology cycles have swung back and forth from centralized to distributed data workloads, noted MapR's Schroeder. "Big Data solutions initially focused on centralized data lakes that reduced data duplication, simplified management and supported a variety of applications including customer 360 analysis. However, in 2016, large organizations will increasingly move to distributed processing for Big Data to address the challenges of managing multiple devices, multiple data centers, multiple global use cases and changing overseas data security rules."

All about the Algorithm

As companies become increasingly interested in using data analysis to detect and mitigate cyber attacks, they will realize that effective algorithms, not the data itself, are key, said Hitesh Sheth, CEO of Vectra Networks, a provider of automated threat management software.

"To combat cyber attacks that evade perimeter security, enterprises are collecting petabytes of flow and log data in hopes of detecting attacks," Sheth said. "These systems turn into unwieldy analysis projects that typically detect an attack only after the damage is done, wasting valuable time and money. Threat detection algorithms will play a significant role in making Big Data more useful and actionable."

Ann All is the editor of Enterprise Apps Today and eSecurity Planet. She has covered business and technology for more than a decade, writing about everything from business intelligence to virtualization.
