Business Intelligence Is Booming, Gartner Finds

by Drew Robb

From automation to data warehouse consolidation, Gartner's BI Summit was all about making better decisions. While companies are spending big on business intelligence, they aren't always getting good results, said Gartner analyst Rita Sallam.

At this week's Gartner BI Summit in Los Angeles, a packed downtown JW Marriott hotel witnessed a mix of briefings from Gartner analysts, case studies from a wide range of organizations and the latest numbers from the business intelligence market (which includes data warehouses and CRM analytics, per the Gartner system).

“The overall business intelligence market is growing at 9 percent per year and will be worth $81 billion in 2014,” said Gartner analyst Rita Sallam. “By 2020, that will rise to $136 billion.”

The Three Rs of Business Intelligence

In his keynote, Gartner Vice President Bill Hostmann covered the three Rs of business intelligence: relevance, resources and renovation. He challenged business intelligence professionals to ask tough questions around these topics.

“Are we relevant in improving business decisions? Do we have the right combination of resources, roles and responsibilities in order to succeed?” said Hostmann. “And, for renovation, how do we decide which information, analytical trends and new technology to focus on?”

He dove into each of the Rs during his hour-long talk. For example, he tasked business intelligence professionals with determining their role. If their vision of their role doesn't match up with the vision of management, their predictions and analysis are in danger of becoming irrelevant.

“Find out what your role really is,” said Hostmann. “What kind of decisions are you supposed to help the organization make? Then make yourself relevant and get yourself an agreed-upon measure of success.”

To be relevant, he added, avoid becoming lost in a mountain of possible analyses. Focus on a few select challenges that are important to management and solve those.

Hostmann’s final point concerned teamwork: having business intelligence staff work more closely with IT. Maybe in the old days, either party could get away with operating in isolation. But with information no longer residing in a single database repository, and so much data coming in from so many unstructured sources, business intelligence professionals and IT must partner to solve ongoing issues.

“Business intelligence is a team sport,” he said. “You have to have IT and BI working together.”

Down to One Data Warehouse

While the bulk of presentations came from Gartner sources, they were broken up by the inclusion of a series of customer sessions highlighting business intelligence in the real world. Mohammad Rifaie, vice president of Enterprise Information Management at Canada-based RBC Financial Group, explained his journey from having multiple data warehouses throughout the organization to establishing one central warehouse called Data Warehouse Enterprise (DWE).  Work began in the late '90s when information management was highly fragmented.  Back then, the company had 10 data marts.

“If that madness had continued, we’d probably have more than 30 by now,” Rifaie said. “Putting all our data warehouse functions in one group really helped.”

By 2000, the DWE was completed. To get there, RBC invested a lot of time in data standards. No database is permitted, for example, that does not have an approved data model based on the bank’s standards.

“It’s like handing your blueprints into the city to verify everything is per code,” said Rifaie.  “Standardization has paid big dividends.”

Rifaie uses a combination of ISO (International Organization for Standardization) standards and some developed by RBC. ISO publishes a huge number of data standards for gender, country and other data elements. In some cases he chose to streamline them. For example, Rifaie believes ISO overcomplicates matters with nine different gender codes; his bank reduced that to just a few.
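The approval process Rifaie describes, where no database goes live without conforming to the bank's data standards, amounts to validating every element against an approved code set. A minimal sketch of that idea follows; the element names and code values are invented for illustration and are not RBC's actual standards.

```python
# Hypothetical enforcement of a streamlined code standard at load time.
# Element names and allowed values are illustrative only.

STANDARD_CODES = {
    "gender": {"M", "F", "U"},       # streamlined from a longer ISO list
    "country": {"CA", "US", "GB"},   # subset of ISO 3166-1 alpha-2 codes
}

def validate_record(record: dict) -> list:
    """Return a list of violations; an empty list means the record conforms."""
    violations = []
    for element, allowed in STANDARD_CODES.items():
        value = record.get(element)
        if value not in allowed:
            violations.append(f"{element}={value!r} not in approved code set")
    return violations

print(validate_record({"gender": "M", "country": "XX"}))
# flags the unapproved country code
```

Centralizing the allowed values in one table is also what makes the reuse Rifaie cites possible: a new application can adopt an existing, already-approved element instead of defining its own.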

An advantage of standardization, he said, is reuse. In 2011, his group deployed more than 4,000 data elements, of which almost half had been developed previously. The moral: If you document the elements on a standard architecture, the rate of reuse rises.

RBC’s main production system contains over 145 TB of compressed data, serves over 57,000 users and handles close to 100 million queries annually. While the query rate rose a whopping 524 percent over the last six years and the user base grew by almost 500 percent, annual costs, measured as total cost of ownership (TCO), grew just 2 percent.

“The overall cost per processing unit is 20 percent of what it used to be in 2006,” Rifaie noted. “A query costs us 16 percent of what it did before.”
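The per-query figure follows directly from the growth numbers above: if total cost grew 2 percent while query volume rose 524 percent (that is, to 6.24 times its old level), the cost per query falls to roughly 16 percent of what it was. A quick back-of-the-envelope check, normalizing the 2006 figures to 1.0:

```python
# Check that the quoted figures are mutually consistent.
old_cost, old_queries = 1.00, 1.00
new_cost = old_cost * 1.02        # TCO grew just 2 percent
new_queries = old_queries * 6.24  # query rate rose 524 percent

per_query_ratio = (new_cost / new_queries) / (old_cost / old_queries)
print(f"{per_query_ratio:.0%}")  # prints 16%
```

So the 16-percent-per-query claim is exactly what the 2 percent TCO growth and 524 percent query growth imply together.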

The creation of this standardized data warehousing infrastructure has also enabled better business intelligence results.  At a total cost of $100,000, an application to detect fraud uncovered $75 million in fraudulent loans and mortgages in one year.

A recent update to the system has been the incorporation of unstructured data.  To carry out a project to analyze customer complaints received at contact centers, RBC initially planned to hire 20 more business intelligence specialists to conduct deep analysis. By bringing the data into the DWE platform, and bridging unstructured and structured information, the company achieved its objectives without having to hire the additional specialists.

The next big frontier, said Rifaie, is Big Data. The company has begun incorporating social media streams into the DWE picture. He gave an example of an ad campaign for a new home equity product. Immediate feedback showed that 16 percent of responses voiced concerns about hidden fees.

“By tracking social media, we were able to change our ad to emphasize that the new service was free,” he said. “The negative sentiment immediately vanished.”

Using Automation to Make Better Decisions

Making better decisions was a continuing theme throughout the conference.  While most organizations make a range of decisions, Sallam noted, there is a lack of consistency across decision makers and insight into how decisions are made. This inhibits effectiveness.

“Effective decision making at all levels of an organization separates high-performing companies from poor ones,” she said. “Decision making is so fundamental to success that improving it is the number one driver of BI and analytics.”

Despite all the money thrown at business intelligence, however, Sallam said the majority of organizations still lack a structure for standardizing the decision-making process. What is needed, she said, is to automate repeatable operational decisions in analytic applications to improve the quality and transparency of decisions. The process must involve using the right data, analyzing only accurate data and applying it to the right problems, in alignment with tactical and strategic priorities.

“Many decisions made by a line or operational worker are highly structured, repeatable and made at a high frequency with well-known decision logic,” Sallam said. “The degree of automation is high, and there is little human collaboration required apart from exception handling.”

She mentioned an emerging subcategory of analytics called intelligent decision automation (IDA), in which well-known decision rules and workflows are embedded in decision management tools such as rule engines. Obvious examples include invoking a rule engine to score an applicant's creditworthiness in a loan origination process, or deciding whether to authorize a loan.

“In many cases, the business policies are too complex or the possible variations too great to fully automate the decision,” Sallam said. “For such circumstances, an analytic or decision management service may still be embedded at a point of decision, but human intervention is required.”

An example: Once loan applications are approved or rejected, those falling in a gray approval area could be forwarded to a loan officer, along with supporting information to help him or her make a final ruling.
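The loan origination pattern Sallam describes can be sketched as a pair of thresholds: score above one and the decision is fully automated, below the other and it is automatically rejected, and anything in between is routed to a human. The scoring function and threshold values below are invented for illustration; a real system would call out to a rule engine or a trained scoring model.

```python
# Minimal sketch of intelligent decision automation (IDA) for loan
# origination. Score function and thresholds are hypothetical.

def score_applicant(income: float, debt: float) -> float:
    """Toy credit score in [0, 1]: more income relative to debt scores higher."""
    total = income + debt
    return income / total if total > 0 else 0.0

APPROVE_ABOVE = 0.75  # auto-approve threshold (hypothetical)
REJECT_BELOW = 0.40   # auto-reject threshold (hypothetical)

def decide(income: float, debt: float) -> str:
    score = score_applicant(income, debt)
    if score >= APPROVE_ABOVE:
        return "approved"           # fully automated decision
    if score < REJECT_BELOW:
        return "rejected"           # fully automated decision
    return "refer to loan officer"  # gray area: route to a human with context

print(decide(90_000, 10_000))  # score 0.90 -> approved
print(decide(30_000, 50_000))  # score 0.375 -> rejected
print(decide(60_000, 40_000))  # score 0.60 -> gray area, human review
```

The gray-area branch is the "human intervention" Sallam mentions: the automated service still runs at the point of decision, but it hands the case, along with its supporting score, to a loan officer rather than making the final call.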

Drew Robb is a freelance writer specializing in technology and engineering. Currently living in California, he is originally from Scotland, where he received a degree in geology and geography from the University of Strathclyde. He is the author of Server Disk Management in a Windows Environment (CRC Press).

This article was originally published on Friday, April 6, 2012.