# Mathematical Malpractice: Focus Tested Numbers

One of the things I keep encountering in news, culture and politics is numbers that appear to be pulled out of thin air. Concrete numbers, based on actual data, are dangerous enough in the wrong hands. But scarce data doesn’t seem to intimidate advocates and some social scientists. They will simply commission a “study” that produces, in essence, any number they want.

What is striking is that the numbers seem to be selected with the diligent care and skill that the methods lack.

The first time I became aware of this was with Bill Clinton. According to his critics — and I can’t find a link on this so it’s possibly apocryphal — when Bill Clinton initiated competency tests for Arkansas teachers, a massive fraction failed. He knew the union would blow their stack if the true numbers were released so he had focus groups convened to figure out what percentage of failures was expected, then had the test curved so that the results met the expectation.

As I said, I can’t find a reference for that. I seem to remember hearing it from Limbaugh, so it may be a garbled version (I can find lawsuits about race discrimination with the testing, so it’s possibly a mangled version of that). But the story stuck with me to the point where I remember it twenty years later. And the reason it stuck is that:

• It sounds like the sort of thing politicians and political activists would do.
• It would be amazingly easy to do.
• Our media are so lazy that you could probably get away with it.

Since then, I’ve seen other numbers which I call “focus tested numbers” even though they may not have been run by focus groups. But they strike me as numbers derived by someone coming up with the number first and then devising the methodology second. The first part is the critical one. Whatever the issue is, you have to come up with a number that is plausible and alarming without being ridiculous. Then you figure out the methods to get the number.

Let’s just take an example. The first time I became aware of the work of Maggie McNeill was her thorough debunking of the claim that 200,000 underage girls are trafficked for sex in the United States. You should read that article, which comes to an estimate of about 15,000 total underage prostitutes (most of whom are 16 or 17) and only a few hundred to a few thousand who are trafficked in any meaningful sense of that word. That does not make the problem less important, but it does make it less panic-inducing.

But the 200,000 number jumped out at me. Here’s my very first comment on Maggie’s blog and her response:

Me: Does anyone know where the 100,000 estimate comes from? What research it’s based on?

It’s so close to 1% [of total underage girls] that I suspect it may be as simple as that. We saw a similar thing in the 1980s when Mitch Snyder claimed (and the media mindlessly repeated) that three million Americans were homeless (5-10 times the estimates from people who’d done their homework). It turned out the entire basis of that claim was that three million was 1% of the population.

This is typical of the media. The most hysterical claim gets the most attention. If ten researchers estimate there are maybe 20,000 underage prostitutes and one big-mouth estimates there are 300,000, guess who gets a guest spot on CNN?

—–

Maggie: Honestly, I think 100,000 is just a good large number which sounds impressive and is too large for most people to really comprehend as a whole. The 300,000 figure appears to be a modification of a figure from a government report which claimed that something like 287,000 minors were “at risk” from “sexual exploitation” (though neither term was clearly defined and no study was produced to justify the wild-ass guess). It’s like that game “gossip” we played as children; 287,000 becomes 300,000, “at risk” becomes “currently involved” and “sexual exploitation” becomes “sex trafficking”. 🙁

The study claimed that 100,000-300,000 girls were “at risk” of exploitation but defined “at risk” so loosely that simply living near a border put someone at risk. With such methods, the authors could basically claim any number they wanted. After reading that analysis and picking my jaw up off the floor, I wondered why anyone would do it that way.

And then it struck me: because the method wasn’t the point; the result was. Even the result wasn’t the point; the issue they wanted to advocate was. The care was not in the method: it was in the number. If they had said that there were a couple of thousand underage children in danger, people would have said, “Oh, OK. That sounds like something we can deal with using existing policies and smarter policing.” Or even worse, they might have said, “Well, why don’t we legalize sex work for adults and concentrate on saving these children?” If they had claimed a million children were in danger, people would have laughed. But claim 100,000-300,000? That’s enough to alarm people into action without making them laugh. It’s in the sweet spot between the “Oh, is that all?” number of a couple thousand and the “Oh, that’s bullshit” number of a million.
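The “1% of some convenient base population” pattern behind both the Snyder homeless figure and the trafficking range can be checked on the back of an envelope. Here is a small Python sketch; the population figures are rough, assumed round numbers for illustration, not sourced data.

```python
# Many "focus tested numbers" sit suspiciously close to 1% of some
# convenient base population. All figures below are rough, assumed
# estimates used purely for illustration.

def one_percent(base_population: int) -> int:
    """Return 1% of a base population: the classic 'plausible' guess."""
    return base_population // 100

# Mitch Snyder's homeless claim (~3 million) vs. 1% of the 1980s
# US population (assumed ~230 million):
print(one_percent(230_000_000))  # 2,300,000 -- close to "three million"

# The trafficking range (100,000-300,000) vs. 1% of an assumed count
# of US girls under 18 (~36 million):
print(one_percent(36_000_000))   # 360,000 -- the same ballpark
```

The point of the sketch is how little work it takes: pick a base population everyone has heard of, take 1%, and you land in the alarming-but-plausible range automatically.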

Another great example was the number SOPA supporters bruited about to support their vile legislation. Julian Sanchez details the mathematical malpractice here. At first, they claimed that $250 billion was lost to piracy every year. That number — based on complete garbage — was so ridiculous they had to revise it down to $58 billion. Again, notice how well-picked that number is. At $250 billion, people laughed. If they had gone with a more realistic estimate — a few billion, most likely — no one would have supported such draconian legislation. But $58 billion? That’s enough to alarm people, not enough to make them laugh and — most importantly — not enough to make the media do their damn job and check it out.

I encountered it again today. The EU is proposing to put speed limiters on cars. Their claim is that this will cut traffic deaths by a third. Now, we actually do have some data on this. When the national speed limit was introduced in America, traffic fatalities initially fell about 20%, but then slowly returned to normal. They began falling again, bumped up a bit when Congress loosened the law, then leveled out in the ’90s and early ’00s after Congress completely repealed the national speed limit. The number of fatalities has plunged over the last few years and is currently 40% below the 1970s peak, without a national speed limit.

That’s just raw numbers, of course. In real terms — per million vehicle miles driven — fatalities have plunged almost 75% over the last forty years, with no discernible effect from the speed limit law. Of course, more cars contain single drivers than ever before. But even on a per capita basis, car fatalities are half of what they once were.
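The normalization described above (raw counts versus per-mile and per-capita rates) is easy to make concrete. The figures in this Python sketch are assumed round numbers for illustration, not official statistics:

```python
# Normalizing raw fatality counts by exposure. All inputs are assumed,
# round figures for illustration, not official statistics.

def per_million_vmt(fatalities: int, vehicle_miles: float) -> float:
    """Fatalities per million vehicle miles traveled (VMT)."""
    return fatalities / (vehicle_miles / 1_000_000)

def per_100k_people(fatalities: int, population: int) -> float:
    """Fatalities per 100,000 population."""
    return fatalities / (population / 100_000)

# 1970s-ish figures (assumed): ~52,000 deaths, ~1.3 trillion miles, ~210M people.
then_vmt = per_million_vmt(52_000, 1.3e12)       # 0.04 deaths per million VMT
then_cap = per_100k_people(52_000, 210_000_000)  # ~24.8 per 100,000 people

# Recent-ish figures (assumed): ~33,000 deaths, ~3 trillion miles, ~315M people.
now_vmt = per_million_vmt(33_000, 3.0e12)        # 0.011 deaths per million VMT
now_cap = per_100k_people(33_000, 315_000_000)   # ~10.5 per 100,000 people

# Miles driven more than doubled while deaths fell, so the per-mile
# rate drops much faster than the raw count (roughly 70-75% here).
print(f"per-mile drop:   {1 - now_vmt / then_vmt:.1%}")
print(f"per-capita drop: {1 - now_cap / then_cap:.1%}")
```

The design point is simply that the denominator matters: the same raw counts can look like modest progress or dramatic progress depending on whether you divide by miles driven, by population, or by nothing at all.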

That’s real, measurable progress. Unfortunately for the speed limiters, it’s the result of improved technology and better enforcement of drunk driving laws.

So the claim that deaths from road accidents will plunge by a third because of speed limiters is simply not supported by the data from the United States. They might plunge as technology, better roads and laws against drunk driving spread to Eastern Europe. And I’m sure one of the reasons they are pushing for speed limiters is that they can claim credit for that inevitable improvement. But a one-third decline is just not realistic.

No, I suspect that this is a focus tested number. If they claimed fatalities would plunge by half, people would laugh. If they claimed 1-2%, no one would care. But one-third? That’s in the sweet spot.