Annual Cost of Project Failure

I was reading up on project failure statistics this weekend, as I often need to remind my clients of the harsh realities of project management when entering into a new project. I don’t like to scare my clients, but when I try to nail down scope at the start of a project, I still hear “oh, we’ll figure that out later” from stakeholders and sponsors. I like to put things in proper perspective.

Many of us are familiar with the Standish Group’s Chaos Report. Indeed, it’s the source I was planning to use in my research. But this weekend I came across a relatively new white paper called “The IT Complexity Crisis: Danger and Opportunity” by Roger Sessions.

This paper is the first real attempt I’ve seen to put solid dollar estimates behind the cost of IT project failures. I think he takes a lot of liberties to get his numbers, but the logic he uses is sound. When dealing with numbers of this scale, it’s hard to be accurate. He asks in his white paper that we “don’t get overly focused on the exact amounts…the real point is not the exact numbers, but the magnitude of the numbers and the fact that the numbers are getting worse”.

I very much enjoyed Mr. Sessions’ paper. Over the next few posts I’ll be kicking out a few different ways of looking at his thoughts, starting from his assumptions, and working through his recommendations.

For today’s post, I took his predicted annual cost of IT failure, which is 8.9% of any country’s GDP (that’s just *huge*). I pulled worldwide GDP figures from the latest CIA World Factbook, which is why my numbers differ slightly from his. The results are as follows:
Annual Predicted Cost of IT Project Failure in Millions of Dollars
It’s not hard to see who the winner(?) is. Data for these maps is in the attached Excel spreadsheet, Project Failure Rates Derived From GDP.

According to these calculations, the United States wastes $1.2 trillion every year on failed technology projects. I had to figure that out by counting off commas. This isn’t a one-off occurrence. If these estimates are even close to accurate, each and every year, companies in the United States take over a trillion of their hard-earned dollars and burn it. Worldwide? We’re talking $6.2 trillion.
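The arithmetic behind these figures is a single multiplication. Here is a minimal sketch; the GDP values are rough, illustrative numbers in the ballpark of the CIA World Factbook figures, not the exact inputs used for the maps:

```python
# Sessions' estimate: annual IT failure cost = 8.9% of GDP.
FAILURE_RATE = 0.089

# GDP in trillions of dollars -- approximate, illustrative values only,
# not the exact CIA World Factbook figures behind the maps.
gdp_trillions = {
    "United States": 13.5,
    "World": 70.0,
}

for region, gdp in gdp_trillions.items():
    cost = gdp * FAILURE_RATE
    print(f"{region}: roughly ${cost:.1f} trillion burned per year")
```

With these inputs the script lands near the $1.2 trillion (US) and $6.2 trillion (worldwide) figures above; swapping in the actual Factbook numbers shifts the results slightly, which is exactly why my totals differ a little from Sessions’.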

Annual Cost of IT Project Failure in the United States

In the United States, California is the top burner, with $164 billion a year getting sucked up on IT failures, Texas consumes $108 billion a year, and New York chews up $101 billion a year.

I’m a professor of project management at the college where I work. My students continually amaze me with their insights, passion and all-around awesomeness. I figure they deserve access to more answers than I can give them by myself. This site is for them.

  • Anonymous

    Don’t let a poor schedule get in the way of a successful project. Don’t ignore the gaps in your schedule. We can beat this problem!

  • boblight

    Hi Geoff,

    Some staggering numbers indeed, and good fodder for sharing a reality check with customers. I look forward to seeing how you apply your experience and creative vision to the analysis of the why of this, and some tips on how to cut the waste.

    No pressure… ;>)

  • No problem, I'll just click my heels together and everything will be alright. Oh wait, no, that's the prescription medication, not me solving anything. 😀

  • PatrickRichard

    Interesting numbers but I think that they may distort the picture. Allow me to twist them another way, remembering the old saying “Lies, more lies, and statistics”…
    If I take into account the population estimates, in millions, of three states (CA, TX, and VT) for 2009, courtesy of the US Census Bureau, the numbers aren’t so impressive anymore. Here goes:
    • VT: 0.6 million inhabitants; $2,184 million in “burn”; about $3,640 million of burn per million inhabitants
    • TX: 24.782 million inhabitants; $108,892 million in “burn”; about $4,394 million of burn per million inhabitants, or about 1.21 times VT
    • CA: 36.961 million inhabitants; $164,361 million in “burn”; about $4,447 million of burn per million inhabitants, or about 1.22 times VT
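The per-capita normalization is easy to reproduce; a quick sketch using the same population and burn figures as the list above:

```python
# Per-capita normalization of the "burn" figures.
# Population in millions, burn in millions of dollars -- same numbers as the list above.
states = {
    "VT": {"pop": 0.6, "burn": 2184},
    "TX": {"pop": 24.782, "burn": 108892},
    "CA": {"pop": 36.961, "burn": 164361},
}

# Vermont is the baseline: about $3,640M of burn per million inhabitants.
vt_rate = states["VT"]["burn"] / states["VT"]["pop"]

for name, s in states.items():
    rate = s["burn"] / s["pop"]
    print(f"{name}: ${rate:,.0f}M per million inhabitants ({rate / vt_rate:.2f}x VT)")
```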

    The 20% overage for large states can potentially be explained in many ways: some large states over-regulate society and throw technology at everything, others may be really bad at project controls (or even corrupt), etc.

    I’m not saying that there isn’t a problem, or that it isn’t serious, but we may want to present the information in a dispassionate way; after all, $3,640 million of burn per million inhabitants is still a large number.

    Patrick Richard ing., PMP

  • Heya Patrick, thanks for the extra math! 🙂 I think you're right–the actual numbers are less important than the magnitude of the problem. Even using your more conservative estimates, after you've blown through a certain number of dollars, how many more dollars you burn doesn't really matter anymore…even the largest budget is still finite.

    The biggest issue I see with all this data is, it doesn't seem to have relevance to people. The problem and the scale of project waste has been known for a long time, but it's almost as if people look at the numbers and say “well, sure, but that doesn't happen at my company–it doesn't impact me.”

    With the next few posts, I'm mostly trying to drive the numbers home and put them in a form that people can relate to a bit better.


  • Look mate, I'll be careful with my language, as I might end up going to jail for using profanity. I'm amazed at the level of abstract, unscientific, non-fact-based discussion taking place in the context of project failure rates. I've read the white paper in your link above, and it left me perplexed. Assuming a failure rate of 65%, based on nothing but a vague resemblance to the Standish report, whose own reliability has been strongly questioned by myself and other PM professionals, is ludicrous and unprofessional, and can't be taken seriously by anyone who seeks to base his or her decisions and comments on substantiated data. Publishing this sort of information is pure sensationalism that does nothing for our profession.

  • I disagree. As I mentioned in the article, the exact numbers are less important than the magnitude. There is a problem, and I doubt anyone who looks at this data, or any other more palatable data sets, would contend that. a) I don't mind a little sensationalism if it gets people to pay attention to the problem; b) I think there are answers to be found by digging deeper into these results. I wouldn't call that nothing.

  • Ok, in that case, let's get down to the details and leave the generalities behind.

    You talk about the magnitude. The magnitude can only be substantiated with numbers. You are suggesting that even though the 65% estimate is not necessarily correct, the actual numbers are not far from it. If this is your argument, you must explain how you know that, and provide clear and reliable references to substantiate it.

  • I don't think so. The fact is, the real numbers being guessed at here can never be known. By anyone. Ever. If you're looking for references for this post, the formula is in Roger Sessions' white paper I've linked here–he's very clear on his assumptions. If you're looking for calculations on the post that comes after this, there's a link to my arithmetic in the body of that post. All of my references are clearly provided.

    Are they reliable? They're as reliable as they can be with the vast amounts of unknown. Do you value the logic? That's entirely up to you.

    It really comes down to this: since we can never know the actual numbers, there are two positions to take:

    1) Try.
    2) Don't bother.

  • Sorry mate, but that's exactly my point. There is something wrong about guessing a number and then using that number as the basis for a supposition. It simply does not add up and this is the sort of substance that houses of cards are built on. If you want to raise a serious argument in relation to the concept of projects failure rate you must have at least one solid component, without which the whole argument collapses.

  • I understand your point, but I'm satisfied with my sources. I believe the Standish Group and Roger Sessions have put a lot of effort into trying to understand the picture and have used the best models they have available to create it. Is it bang-on accurate? Not a chance. Is the problem it presents real? I believe it is. You're free to debunk it.

    What's more important: the picture itself or what I do with it?

    Because I believe in the picture, my message to stakeholders, sponsors and project managers will be, “value every dollar and reduce waste”. That's good sense, but people forget, overlook and disregard that every day. We've all seen it: we all know it happens. I believe this data shows the magnitude of the results of that behaviour. I also believe it's a very effective picture that says, “this is what happens when you forget that basic tenet”.
