False positives
If the condition you are screening for is rare, then even very sensitive and accurate tests will give you bad results, because they will create many false positives. Suppose you have a test which is 99% accurate for some disease which occurs only 0.01% (one hundredth of one percent) of the time. In a population of 10,000 people you should expect one instance of the disease, but the 1% error rate means the test will also flag about 100 healthy people as "positive". So of the roughly 101 people flagged, only one is actually ill. That's about a 1% hit rate.
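The arithmetic can be sketched in a few lines. This is an illustrative calculation only, assuming "99% accurate" means both 99% sensitivity (true positive rate) and 99% specificity (true negative rate):

```python
# A hypothetical 99%-sensitive, 99%-specific test applied to a
# population where the disease occurs 0.01% of the time.
population = 10_000
prevalence = 0.0001          # 0.01%
sensitivity = 0.99           # fraction of sick people correctly flagged
specificity = 0.99           # fraction of healthy people correctly cleared

sick = population * prevalence                  # ~1 person
healthy = population - sick                     # ~9,999 people

true_positives = sick * sensitivity             # ~0.99
false_positives = healthy * (1 - specificity)   # ~100

# Positive predictive value: given a positive result,
# what is the chance you are actually ill?
ppv = true_positives / (true_positives + false_positives)
print(f"people flagged: {true_positives + false_positives:.0f}")
print(f"chance a flagged person is ill: {ppv:.1%}")
```

The flagged count comes out around 101, and the chance that a flagged person is actually ill is about 1%, matching the back-of-envelope figure above.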
The Chicago Boyz criticize TIA because it generates so many false positives, but don't seem to realize this is because terrorists are very rare (thank god), not because the TIA is particularly incompetent (which it may be).
They compound their error by claiming that adding more information to the patchy databases will make things worse by generating more false positives. That depends on whether the new data improves the test's sensitivity or not, but as the simple arithmetic above shows, the sensitivity of the test is not the problem; the low incidence in the population is.
Suppose, instead of testing everyone for disease X, we tested a subpopulation with a higher incidence of the disease, say 10%. Testing 100 people would yield 10 instances of the disease, and roughly 1 false positive. The same 99% accurate test, which was giving a correct positive only 1% of the time, now does so about 91% of the time. Much better.
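Running the same sketch with the higher prevalence shows the swing (again, my illustrative numbers, assuming 99% sensitivity and specificity):

```python
# The same hypothetical 99%/99% test, but applied to a subpopulation
# where the disease occurs 10% of the time.
population = 100
prevalence = 0.10
sensitivity = 0.99
specificity = 0.99

sick = population * prevalence                  # 10 people
healthy = population - sick                     # 90 people

true_positives = sick * sensitivity             # 9.9
false_positives = healthy * (1 - specificity)   # 0.9

ppv = true_positives / (true_positives + false_positives)
print(f"chance a flagged person is ill: {ppv:.0%}")
```

With the exact rates this gives about 92%; the rounder numbers above (10 true hits, 1 false alarm) give 10/11, or about 91%. Either way, nothing about the test changed, only the population it was pointed at.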
I have no idea how many people in the US are terrorists, but it's a tiny number. Increasing the sensitivity of any test is good, but if you want to reduce the false positive rate you must be more selective about whom you screen. Sadly, the only way I can think of to do this is to racially profile in some way, which is unpopular for all the usual reasons. But the alternative is a high false positive rate, unless you abandon screening altogether. All of these seem like bad options.
(BTW: I would not take stories about TIA targeting anti-Bush people very seriously. Given that around 50% of the country is Democrat, and therefore dislikes Bush, you would expect about half the people flagged to be anti-Bush.)