DKos's public health expert DemFromCT has a link to a Nature editorial that starts out:
Complacency, not overreaction, is the greatest danger posed by the flu pandemic. That's a message scientists would do well to help get across.
As I've read in earlier posts by DemFromCT, the key to preventing a large number of deaths from a virulent swine flu virus is to get the reinfection rate — the number of new infections per infected person — below 1. Then the outbreak snuffs itself out. The trouble is that this is nearly impossible to do once too many people are infected, so you need to act very early in the flu's spread. Judging by the number of joke threads running through twitterstreams, this is not widely understood: to much of the public, the health professionals appear to be overreacting to a small number of cases.
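To see why that threshold of 1 matters so much, here is a minimal sketch (my illustration, not from DemFromCT's posts) using a toy branching-process model: each infected person infects some average number `r` of others per generation. The specific values 0.9, 1.5, 100 initial cases, and 20 generations are arbitrary choices for illustration.

```python
def total_infections(r, initial=100, generations=20):
    """Sum expected infections over generations of a toy branching process."""
    infected = initial
    total = initial
    for _ in range(generations):
        infected *= r          # each current case produces r new cases
        total += infected
    return round(total)

# r below 1: each generation shrinks, and the outbreak fizzles.
print(total_infections(0.9))
# r above 1: each generation grows, and the outbreak explodes.
print(total_infections(1.5))
```

With r = 0.9 the toy outbreak stays under a thousand total cases; with r = 1.5 it runs close to a million. The same small early effort that nudges r below 1 looks, in hindsight, like a fuss over nothing.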
So if the CDC and the WHO are successful in limiting the spread of the disease, they will be seen not as successful managers but as nervous nellies. And it will be harder, next time, to implement effective measures precisely because they were so effective.
This reminded me of the Y2K computer scare. In fact, a lot of work was done, a lot of money was spent, and the crisis was averted. But the very success of the effort led to many people concluding that there really hadn't been an incipient crisis after all.
Moreover, not only did the Y2K software repairs prevent a collapse of corporate computer systems, they also forced the creation of off-site backup and disaster-recovery systems. That, in turn, was partly responsible for the speed with which Wall Street was able to restart its systems after 9/11.
Robust systems that prevent disaster are hard to justify to bean-counters and taxpayers. But that is the right way to design a system.