Why Most Business Intelligence Projects Fail

by Wayne Kernochan

According to Gartner, 70 to 80 percent of business intelligence projects fail. The solution could be a more agile development process — and organization.

Recently, Gartner noted a study that purported to show that "between 70% and 80% of BI projects fail." Ordinarily, I would treat this kind of statement, from Gartner or anyone else, as the equivalent of "give me your money or you're going to be in really, really bad trouble." However, the author of the study does have recent CIO experience with business intelligence (BI). Moreover, some of the problems he cited as causing BI project failure are well-known as causes of failures in other areas of IT — in particular, bad communication between IT and corporate. So there is good reason to think that a significant proportion of today's business intelligence projects do indeed fail, by any reasonable standard.

On the other hand, 70% to 80% is an unusually high proportion of projects to fail. For one thing, business intelligence is a well-established discipline in large organizations. Moreover, much of existing BI concerns itself with enterprise reporting — canned querying applied at certain times of the year. Just how plausible is it that an extension of enterprise reporting fails 70-80% of the time, at least 15 years after BI started doing enterprise reporting in the first place? Something seems odd about this figure.

So how significant to the average user, really, is "BI project failure"? And if it is important to a particular user, is the fix really to improve CIO-CEO communications by teaching IT to talk in corporate jargon?

We Have Been Here Before

In fact, BI "projects" bear a surprising resemblance to the software development projects that enterprises have been doing for the last 45 years. Often, they involve writing new software. Certainly, they require writing new queries and using existing software in new ways that demand design and planning beforehand. And even where a developer's expertise isn't required, a "data miner," who knows how to translate requests for information into querying commands, is.

Given this resemblance, we would expect the failure rate of BI projects to have some relation to the failure rate of software development projects. But when we look at the history of attempts to find out development failure rates, we discover, embarrassingly, that no one has ever really succeeded in getting a meaningful figure for failure rates.

That isn't to say that three decades of researchers haven't tried. Efforts have ranged from attempts to measure lines of working code written, to abstractions of functions coded called "function points," to comparative studies, to "bake-offs" between competing development tools. As it turns out, the only meaningful long-term measure of "failure rate" is ... what the enterprise says it is. That is, one firm that started tracking user reports of success vs. failure in the early 1990s found that, consistently, 40-60% of development projects were reported as "late," "never finished," or "didn't do what was asked."

Or at least that was true until five years ago, when IT, offshore, and vendor developers began seriously committing to a process called "agile development." As that happened, astonishingly, reported failure rates shrank to 20%, then to 15%, and, in cases where agile was used for everything, to close to 0%.

Full disclosure here: As a programmer long ago in the infancy of development, I was a proponent of agile development ideas well before they surfaced in the Agile Development Manifesto 10 years ago. I am, indeed, proudly prejudiced about the positive effects of both agile development processes and the ideas of business agility that logically follow. But I must point out that research I have done since bears out my point. An extensive survey two years ago confirmed that by every metric I could find, IT and business alike, agile development brought 15-45% improvement (a recent IBM study claims 60%). And by the same metrics, all other popular processes, tools, and the like had either no effect or a minimal one.

So the question of whether 70-80% of BI projects fail turns into: if the BI project isn't agile, why wouldn't it have high failure rates? And if it is agile, how is it credible that failure rates are that high?

Agile Business Intelligence Would Be A Good Idea

The story goes that a reporter once asked Gandhi (who as a lawyer had been in England and knew what he was talking about) what he thought of Western civilization. "I think," Gandhi replied, "that it would be a good idea."

By the same token, to ask whether BI projects are agile given that reported failure rates are 70-80% is a question with its own answer. Of course most BI projects do not use truly agile processes. There is simply no way that agile BI projects could achieve such a failure rate. Rather, this failure rate is an indication that BI is lagging far behind in its ability to achieve an agile process — and therefore, interestingly, in its ability to increase business agility. What do I think of agile BI? I think it would be a good idea.

Certainly I am not the only one to note the failure of projects that enterprises mislabel as agile. Jill Dyche, a long-time consultant, recently caused a stir by decrying projects that try to do BI "on the cheap" by applying agile programming teams uncritically, without the careful planning and communication with the business user that needs to go on (see What Agile Business Intelligence Really Means). For those who don't know, however, frequent communication between the developer and the business user is a key part of a truly agile development process. In effect, Jill, the Gartner author, and I are in this particular case saying much the same thing: there needs to be effective communication between the business, with its needs, and the computing arm that attempts to respond to those needs.

Where we part company is in just how that communication occurs. For at least 15 years, everyone has been talking about improving communications between IT and corporate; but what they really mean is communications between the top of the IT pyramid and the top of the corporate pyramid. Agile aims at something quite different: a personal bear-hug between the actual user of the software and the developer, in which both contribute to a constantly evolving improvement, ensuring a product that is just as relevant to the user at "final ship" as it was when the user first set out his or her needs. In effect, agile eliminates the failures classified as "not just what I wanted."

The really counterintuitive part of agile, though, is that it also reduces or eliminates the other two types of failed project: late, and never finished. This part of agile is counterintuitive because anyone looking at a "spiral" agile process immediately sees inefficiencies and lack of quality control. The agile process seems to say, "go this way, no, the user changed his or her mind, go this way," or "ready, fire, aim." At the same time, there seems to be an uncomfortable choice between wasting bug-fixing effort on code that will never be used and skimping on testing until the end of the process, when everything is finally nailed down.

Everyone has his or her own version of why agile still lowers lateness and inability to pass final test, and many of them are versions of the remark of the impresario in "Shakespeare in Love": it's a miracle. However, to my mind, the reason that agile actually produces better quality than quality control, and faster finishes than waterfall design-then-code-then-test programming, is that it focuses not on developing as fast as possible, but on iterating from prototype to prototype as fast as possible. In this way, the final product is built "middle in" from tested code steadily over the course of the project, user benefits from part of the product come well before the traditional "final ship," and retrofitting features towards the end of development is kept to a minimum.

The implications for business intelligence projects are straightforward. The way to go is not to improve communication between CIO and CEO, useful though that can be in other situations. The path to pursue in a BI project is to establish ad-hoc, intense ties between developers or data miners and specific users in lines of business, in middle management, or in corporate, and to use those communications links to drive rapid prototyping that goes from "wouldn't it be nice if" to "boy, have I got some great further ideas for this" in nothing flat. It passes belief that, after such a truly agile process, someone would turn around and say, "no, actually I didn't want this," or "gee, I could have gotten the same thing much faster the old way, even though I didn't even realize what I wanted back at the beginning."

Bottom Line: Why Business Intelligence Projects Really Fail

By flipping this analysis over, we can see more clearly why present projects may fail. In effect, the company's effort to generate a new way of analyzing the data suffers from the same problem as a one-way mirror: it does not examine the business' own needs and the way they are evolving, but focuses its scrutiny on flaws in the BI feature development process. Perhaps it is the fault of those darn developers; perhaps the CIO just isn't hearing what the business wants; perhaps there needs to be more careful planning by the designers up front. Don't blame you, don't blame me, blame that fellow under the tree.

Agile development is actually a lead-in to increased business agility, and a highly effective one: increased new-product development agility has the biggest positive effect on the company's bottom line. That means that reducing BI project failure by using agile processes within IT not only speeds up introduction of effective new analyses that allow the business to respond to a rapidly changing environment, but also introduces a culture of agility and agile processes to the rest of the business: the business users that it serves. As the business user finds IT business intelligence responsive to his or her needs, that executive begins to focus more on insights about the ways that the environment is changing, and to perceive that these are valuable aids to his or her career and to the company. Change you, change me, don't just change that fellow under the tree. A two-way mirror means less project failure, not just in BI but all over the company.

A while back, I used a PC video camera for a briefing that had the unique characteristic that, instead of showing the other person talking to me, it showed me myself talking to the other person. It was one of the most successful briefings I can remember, because I was able to be exceptionally agile in matching my gestures and expression to what I was trying to say. Don't court business intelligence project failure by using the traditional one-way mirror or video camera to drive development. Use agile processes to show you both sides: your business' evolving needs and the ways that your tools can be changed rapidly to meet those needs. Or, if you're uncomfortable with that, just give me your money — or you're going to be in really, really bad trouble.


This article was originally published on Monday, May 2, 2011.