Saturday, January 22, 2022

Cost-Benefit Analysis

We all think it's the best way to do business. I am certain I have invoked the phrase here a couple of dozen times, and likely a hundred times in real life. Tim Harford has an interesting review of a Bent Flyvbjerg and Dirk Bester journal article that dampens my enthusiasm considerably. Not because it is a bad idea in theory, but because as a practical reality it allows us to hide important assumptions (guilty) and to underestimate costs and overestimate benefits in ways that fall in with the general ideas around us (also guilty). Which are likely not accurate. I liked the line "The problem is not that every project engineer in the world is incapable of delivering to a reasonable budget; it is that the budgets are never reasonable." That does add up, doesn't it? We are very used to assuming, whenever we hear an estimate for a bridge or a building or a new program, that it is actually going to cost much more. Shouldn't that tell us something about initial estimates in general?

Having argued that cost-benefit analysis is “broken”, Flyvbjerg and Bester propose fixing rather than replacing it — for example by improving the accuracy of cost estimates through better data, independent audits and performance incentives. I agree. The method is open to misuse but is too valuable to abandon.

We don't have anything better. But this is so bad that perhaps we should do something. C.S. Lewis once asserted ("Why I Am Not a Pacifist") that wars never do half the good that the belligerents promise, which I have taken as a wise caution. Yet I think it is worse than that. From the recent history of western nations, we can say that whenever war is proposed, it will cost ten times what we initially think. On the plus side for the US, the loss of our human life is likely to be one-tenth what we fear. This inverse relationship may not be accidental. But to go the next step, the benefit we derive will not be as advertised. It may not be only one-tenth as much, which would be lovely poetry and arithmetic, yet it is certainly much less. If we knew wars were going to deliver only a quarter as much but cost twelve times as much, would we still go forward? Sometimes yes. Sometimes there is no real other choice. But America has had the luxury of choice for most of its history.

There are analyses that show similarly poor results for the War on Poverty or the War on Drugs, or any number of other endeavors, while others, such as the early space program, seem to have paid off better because of the primary research needed. So CBA, yes. Except we don't really do that now; we do an imitation to save face.

1 comment:

Christopher B said...

I'm inclined to be snarky and invoke the John McClane meme "Welcome to the party, pal!"

After accounting for hardware costs, which are usually relatively trivial unless major new installations are needed, the primary cost in most IT projects is simply the time (i.e. salary) needed to do the work. About half a century ago, Frederick Brooks wrote The Mythical Man-Month, which recognized even at that early date that IT project planning and governance were abysmal, and I can pretty much guarantee from personal experience that it's not gotten any better even though the problems are well known. As recently as 2020, industry analysts Gartner Group estimated that 60 percent of IT projects fail. Other analysts have estimated as few as 20 percent are successful, and up to half require substantial reworking after completion.

From the start of the first essay in Mythical Man-Month, these should sound familiar.

First, our techniques of estimating are poorly developed. More seriously, they reflect an unvoiced assumption which is quite untrue, i.e., that all will go well.

Second, our estimating techniques fallaciously confuse effort with progress, hiding the assumption that men and months are interchangeable.

Third, because we are uncertain of our estimates, software managers often lack the courteous stubbornness of Antoine's chef.

Fourth, schedule progress is poorly monitored. Techniques proven and routine in other engineering disciplines are considered radical innovations in software engineering.

Fifth, when schedule slippage is recognized, the natural (and traditional) response is to add manpower. Like dousing a fire with gasoline, this makes matters worse, much worse. More fire requires more gasoline, and thus begins a regenerative cycle which ends in disaster.

The reference in item 3 is to the blurb on the title page: "Good cooking takes time. If you are made to wait, it is to serve you better, and to please you. MENU OF RESTAURANT ANTOINE. NEW ORLEANS." It's also interesting that he references importing practices from other project engineering disciplines in an attempt to fix things, which hasn't worked, and the essay you linked appears to detail why.
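The arithmetic behind Brooks's fifth point is worth spelling out: if everyone on a project must coordinate with everyone else, the number of pairwise communication channels grows as n(n-1)/2, so adding people to a late project adds coordination overhead faster than it adds hands. A minimal sketch (the specific team sizes are just illustrative):

```python
def channels(n: int) -> int:
    """Number of pairwise communication paths among n people: n*(n-1)/2."""
    return n * (n - 1) // 2

# Doubling headcount roughly quadruples the coordination burden.
for team_size in (3, 6, 12):
    print(f"{team_size:2d} people -> {channels(team_size):2d} channels")
```

Three people need only 3 channels; twelve need 66, which is part of why "dousing a fire with gasoline" is the right metaphor.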

I am probably being too cynical and curmudgeonly, but I think Harford is wildly optimistic about the possibility of improving things more than marginally. Towards the end of his essay he references decision-making by HIPPO (highest-paid person's opinion) as an alternative method that can be rejected. He's dead wrong, as this is the *only thing that matters* in most cases even if you claim to be doing CBA. Everything else, all the spreadsheets and PowerPoints and project plans, is secondary; project initiation and continuance come down to convincing somebody with the authority and budget to start spending money and to keep it flowing. The cliche of IT projects being driven by an article the CEO or CTO read in an in-flight magazine, or by losing a d!ck-measuring contest at some business conference, is well-known and often observed by lower-level IT staff. I have no illusions that similar things are not the primary driver of most non-IT projects, with high-speed rail and rail-based mass transit being two areas that I know generate many disasters, either projects never completed after substantial expenditure or with far lower ridership than projected.