Friday, April 17, 2009

Pet Peeve: Mixed Math Metaphors in News Stories

Pet peeve: News articles citing data in ways that make easy comparisons harder. They'll say, as a recent news article about birth rates among immigrants did, that 16% of one group was doing this, while "1 in 5" was doing that. So the reader has to mentally convert 1 in 5 to 20% before he can easily compare it to the earlier 16% figure.

The reporter did this twice more in the same article -- used percentages at one end of a sentence, and some other form of proportions at the other end.

There is no useful news-information reason for doing this. It is done so commonly because writers learn not to be repetitive in the ways they express things. Saying "16% did this while 20% did that" seems dull to them, so they switch math metaphors midstream.

Another reason to work this way is that the reporter may have an exact-seeming figure -- 16% -- but his other piece of data isn't presented to him as "20%" but as an approximation -- "one in five" is a very general number, so writing "20%" implies a higher degree of precision in the latter number than is really there. That sounds reasonable -- but it is a bad workaround, because it makes the reader work harder to do the comparison -- and too many readers are marginally innumerate as it is. A better solution would have been "16% of this, versus about 20% of that."
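
If the worry really is false precision, the normalization is still trivial to do for the reader. A minimal sketch of the idea (the helper name and the rounding choice are mine, purely for illustration):

```python
# Hypothetical helper: put "N in M"-style figures on the same scale as
# percentages, while still flagging that the source number was approximate.
def proportion_as_percent(numerator: int, denominator: int) -> str:
    """Express a proportion like '1 in 5' as an approximate percentage string."""
    value = 100.0 * numerator / denominator
    return f"about {round(value):g}%"

print(proportion_as_percent(1, 5))   # "about 20%" -- directly comparable to "16%"
print(proportion_as_percent(1, 4))   # "about 25%"
```

Printing "about 20%" keeps the comparison easy for the reader while still signaling the approximation.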

No, that excuse doesn't hold up: this happens far too often for any explanation other than writing style triumphing over clear, useful writing.

Another sin is using various kinds of numbers in various places in ways that, if you try to compare them and do some basic math, just don't make sense. I remember an article in Newsweek many years ago -- one of the then-common studies of the poor, underearning American farmer -- where the data presented about the number of farmers, their revenues, and their government grants seemed to add up to a million dollars a year in earnings. If there was some reason those numbers still left the farmers poor, the article didn't bother to explain it. And when I complained in a letter to Newsweek -- this was back in the postal-letter days -- I got a snippy, unhelpful reply that made it clear the editor had no idea what I was talking about. My conclusion: Newsweek, like so many other journals, was using data simply for decorative purposes. This is the same reason graphs showing rises and falls in something have their axes truncated to emphasize the variance. They're not trying to show the true degree of change, they're trying to dramatize it. Which is, I'm sorry, simply bad journalism.

A final related problem is data dumps that make it clear that journalists themselves include sufferers from innumeracy, or maybe math laziness. In this week's Computerworld (Apr. 13, 2009), editor Frank Hayes takes a closer look at a study released by an outfit called The Alliance to Save Energy, which says that if the nation's 108 million PCs (half of them corporate PCs) were shut off at night and on weekends instead of being left on (as is common in corporate environments), the country could save $2.8 billion a year in energy costs. Hayes did the math that the reporters who wrote up this study for general release obviously failed to do, or it wouldn't have gotten much press: it works out to about $26 in savings per PC per year -- or 50 cents a week. In other words, trivial: "less than the cost of the two minutes of a typical employee's time each week, not to mention the four or five minutes waiting for the PC to turn back on each morning."
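
Hayes's back-of-envelope arithmetic is easy to check from the two figures quoted above (the numbers are the study's; the few lines of checking are just mine):

```python
# Back-of-envelope check of the per-PC savings implied by the study's figures.
claimed_savings = 2.8e9   # dollars per year, nationwide
pc_count = 108e6          # PCs in the U.S., per the study

per_pc_per_year = claimed_savings / pc_count
per_pc_per_week = per_pc_per_year / 52

print(f"${per_pc_per_year:.2f} per PC per year")   # about $25.93
print(f"${per_pc_per_week:.2f} per PC per week")   # about $0.50
```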

That may still strike you as a worthwhile savings when you think about it, but you don't get to think about it, because the reporter who wrote up the original study didn't bother doing the math, and therefore didn't present the story in a way that would let you think about the implications. He just carried water for the Alliance to Save Energy At All Costs. Bad reporter -- bad!
