That's the paradox of the false positive. When you try to find something really rare, your test's accuracy has to match the rarity of the thing you're looking for. If you're trying to point at a single pixel on your screen, a sharp pencil is a good pointer: the pencil-tip is a lot smaller (more accurate) than the pixels. But a pencil-tip is no good at pointing at a single atom in your screen. For that, you need a pointer -- a test -- that's one atom wide or less at the tip.
Here's how the paradox of the false positive applies to terrorism:
Terrorists are really rare. In a city of twenty million like New York, there might be one or two terrorists. Maybe ten of them at the outside. 10/20,000,000 = 0.0000005, which is 0.00005 percent. One twenty-thousandth of a percent.
That's pretty rare all right. Now, say you've got some software that can sift through all the bank-records, or toll-pass records, or public transit records, or phone-call records in the city and catch terrorists 99 percent of the time.
In a pool of twenty million people, a 99 percent accurate test will identify two hundred thousand people as being terrorists: the test gets it wrong one time in a hundred, and one percent of twenty million is two hundred thousand. But only ten of them are terrorists. To catch ten bad guys, you have to haul in and investigate two hundred thousand innocent people.
Guess what? Terrorism tests aren't anywhere close to 99 percent accurate. More like 60 percent accurate. Even 40 percent accurate, sometimes.
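For the curious, here is a rough back-of-the-envelope sketch of that arithmetic in Python, under one labeled assumption: "X percent accurate" is read as a false-positive rate of (100 minus X) percent applied to the innocent population. The twenty million people, the ten terrorists, and the 99/60/40 percent figures come straight from the paragraphs above; the variable names and the loop are just illustration.

# Sketch of the false positive paradox.
# Assumption: "X percent accurate" means a false-positive rate of (1 - X).
population = 20_000_000   # people in the city
terrorists = 10           # actual bad guys, at the outside

for accuracy in (0.99, 0.60, 0.40):
    false_positive_rate = 1 - accuracy
    innocents = population - terrorists
    falsely_flagged = innocents * false_positive_rate
    print(f"{accuracy:.0%} accurate: ~{falsely_flagged:,.0f} innocent people flagged "
          f"to catch at most {terrorists} terrorists")

Run it and the 99 percent case lands on roughly two hundred thousand flagged innocents, the same number as above; at 60 or 40 percent accuracy the count runs into the millions.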
What this all meant was that the Department of Homeland Security had set itself up to fail badly. They were trying to spot incredibly rare events -- a person is a terrorist -- with inaccurate systems.
Is it any wonder we were able to make such a mess?
#
I stepped out the front door whistling on a Tuesday morning, one week into Operation False Positive. I was rockin' out to some new music I'd downloaded from the Xnet the night before -- lots of people sent M1k3y little digital gifts to say thank you for giving them hope.