Robert McNamara tracked every statistic he could, except the happiness of the Vietnamese people

(written by lawrence krubner, however indented passages are often quotes).

When I wrote "When companies make a fetish of being data driven they reward a passive-aggressive style," I made the point that leaders often gather up huge amounts of information, but focus on the wrong type of information. For instance:

Google seems like an example of how a “data driven” company can go off the rails. I’m not sure what their meetings are like, but I know that in interviews the management at Google talks about their focus on data — yet their brand image continues to decline. Likewise, Marissa Mayer praised the “data driven” culture that she helped develop at Google, but when she took it to Yahoo she focused on all the wrong data — she has talked about analyzing data to learn how to boost user engagement, yet anyone who knows anything about the sub-cultures that thrived on Tumblr is also aware that Tumblr has suffered an exodus, which started around the time that Mayer gained power.

This is another good example of a leader who gathered a mountain of facts, but not the right facts:

During the Vietnam War Secretary of Defense Robert McNamara tracked every combat statistic he could, creating a mountain of analytics and predictions to guide the war’s strategy.

Edward Lansdale, head of special operations at the Pentagon, once looked at McNamara’s statistics and told him something was missing.

“What?” McNamara asked.

“The feelings of the Vietnamese people,” Lansdale said.

That’s not the kind of thing a statistician pays attention to. But, boy, did it matter.

I believe in prediction. I think you have to in order to get out of bed in the morning.

But prediction is hard. Either you know that or you’re in denial about it.

A lot of the reason it’s hard is because the visible stuff that happens in the world is a small fraction of the hidden stuff that goes on inside people’s heads. The former is easy to overanalyze; the latter is easy to ignore.

There is also this anecdote:

“There’s the old apocryphal story that in 1967, they went to the basement of the Pentagon, when the mainframe computers took up the whole basement, and they put on the old punch cards everything you could quantify. Numbers of ships, numbers of tanks, numbers of helicopters, artillery, machine gun, ammo—everything you could quantify,” says James Willbanks, the chair of military history at U.S. Army Command and General Staff College. “They put it in the hopper and said, ‘When will we win in Vietnam?’ They went away on Friday and the thing ground away all weekend. [They] came back on Monday and there was one card in the output tray. And it said, ‘You won in 1965.’”

You can consider all of these examples of “garbage in, garbage out,” but that observation should lead to a more thoughtful discussion of how leaders can use information intelligently to guide their decision making. That is, how does one avoid “garbage in”?