<p>Photo: Malnourished children in Dhaka, Bangladesh</p>

Nutrition makes a big difference to tiny patients at a research hospital in Dhaka, Bangladesh. Admitted with diarrhea, all weighed 50 percent below the norm for their age. Two recent arrivals are still scrawny, but the baby on the right is ready to go home after three weeks of eating locally available, calorie-rich foods. Without such intervention many severely malnourished children succumb to disease.

Photograph by Karen Kasmauski

Written by Rick Weiss

Republished from the pages of National Geographic magazine

It was hot. Swamps stretched before him in every direction. And there were mosquitoes. Lots of mosquitoes. Those were the things Steven Wiersma noticed first as he stepped out of his car last summer in rural Sirmans, Florida. Everything was typical Florida—except for the one thing that Wiersma, the state's chief epidemiologist, had come to investigate.

The guts of those swarming mosquitoes, Wiersma had recently learned, were filled with a virus from a far-off land—a virus that had never inhabited the Western Hemisphere until 1999 but had taken an instant liking to it. That virus—named for its homeland, the West Nile district of Uganda—had found the United States to be a bountiful place. Plenty of birds to live and breed in. And plenty of mosquitoes to spread the virus from bird to bird and—as was beginning to happen with somewhat alarming regularity—from birds to people.

People like 73-year-old Seymore Carruthers of Sirmans, who lay in a coma with encephalitis in a Tallahassee hospital that week because he'd been bitten by one of those infected mosquitoes. It was just about the farthest south in the continental U.S. the virus had ventured since arriving in New York State two years earlier. And it was ominous evidence that the bug was settling into the nation's Sunbelt, home to so many mosquitoes and to so many of the nation's elderly—least capable of fighting off infection by the virus.

Looking around the neighborhood, Wiersma could easily see how Carruthers had been felled. Mosquitoes breed in stagnant water, and there was plenty of it around—a direct result of more people and more of their trappings. In addition to the area's natural swamps there were tarpaulins draped over farm and construction equipment, puckered with little pockets of rainwater. Birdbaths. Abandoned buckets in yards. Every stray puddle teemed with millions of mosquito larvae. "Where these people live, mosquito bites are part of daily life," Wiersma says. "It hasn't always mattered so much. But it's mattering more and more."

The recent and wholly unanticipated eruption of West Nile fever in the United States has been a sobering experience for public health officials, who estimate the virus has already infected tens of thousands of Americans, sickening more than 2,000 and killing about a dozen. Far more sobering, however, is that West Nile pales in comparison with the many more ferocious infectious diseases—including those delivered intentionally by terrorists—emerging and reemerging around the globe.

Ebola is one familiar example, though that virus, it turns out, is too deadly for its own good; it kills its human victims so fast it has little opportunity to transfer from person to person and so is unlikely ever to grow into a full-fledged pandemic. But other ailments—some famous, some obscure—pose increasingly serious hazards. The mosquito-borne viruses that cause fatal dengue hemorrhagic fever and its sister disease, yellow fever—both supposedly vanquished by the 1940s—are again resident through much of South and Central America, and dengue has recently made inroads into the Caribbean and the southern United States. And with more people on the planet providing more places for mosquitoes to breed, the stage is set for a public health disaster of hemispheric proportions.

Tuberculosis has grown coldly resistant to the effects of modern antibiotics in the former Soviet Union and other regions of the world. With its ease of transmission by invisible respiratory droplets and its close association with HIV, TB is in an excellent position to wreak global havoc in the new millennium. And malaria, which already kills an estimated 1.2 million people annually—more than half of them children—has grown similarly resistant to standard medicines.

The list goes on: Rift Valley fever, hantavirus, cholera. At least 20 major maladies have reemerged in novel, more deadly, or drug-resistant forms in the past 25 years. Worldwide, scientists have discovered at least 30 previously unknown human diseases for which no cure exists, such as Marburg disease and AIDS.

That's a humbling reality given that just a couple of decades ago experts declared that many infectious diseases were on the brink of extinction. Improved sanitation, mosquito control, global vaccination, and modern antibiotics appeared to have won the war, and self-assuredness spawned complacency. Flush with our early successes against them, we concluded that microbes were no competition for our big human brains. We were wrong.

Largely unnoticed, the world was changing. In developing nations, people were hacking their way into previously inaccessible areas, where a menacing menagerie of bacteria and viruses skulked about, hungry for new warm-blooded hosts. Third World metropolises grew increasingly crowded, overwhelming sewage and water systems and providing a microbial mixing bowl for the creation of new diseases. Wars in nations least able to afford them spawned immense human migrations and refugee settlements with little or no sanitation or medical care. And changing patterns of temperature and rainfall allowed disease-carrying insects to extend their range.

"The world definitely favors the bugs; microbes have the advantage," says Jim Hughes, Director of the National Center for Infectious Diseases at the Centers for Disease Control and Prevention (CDC) in Atlanta. "There are a lot more of them than us. Their generation time is minutes instead of years. They evolve rapidly. And, of course, we aid and abet them in many ways—by travel, commerce in foodstuffs, transportation of animals, and our abuse and overuse of antibiotics. We're playing right into their hands.'

Plagues are not new to humanity, of course. Smallpox thrived long before the Egyptian pharaohs, and it continued to kill one-third of those it struck until, after a heroic international vaccination effort, the last human was afflicted in 1978 and the disease officially eradicated in 1980. In its final century on Earth, smallpox killed more than half a billion people.

The plague of the 14th century, known as the Black Death, wiped out about a fourth of Europe's population in just four years—a tidal wave of death almost unimaginable today. Ignorant of its cause and paranoid of the air itself, medieval society quickly descended into panic and mayhem.

Then came the discovery of the New World, offering microbes a new and deadly windfall. After Columbus first dropped anchor in the West Indies, the native population of the Americas declined drastically, largely from diseases that arrived on European vessels, many of them carrying African slaves.

And as recently as 1918-19 the great global influenza pandemic left at least 20 million dead. By comparison, World War I, fought between 1914 and 1918, claimed 8.5 million casualties.

In many respects microbes have it even easier today. With modern technology, centralized systems of food and water distribution in developed nations tend to amplify the impact of otherwise modest microbial blooms. When Lake Michigan became contaminated with the intestinal bug Cryptosporidium in 1993, hundreds of thousands of residents in Milwaukee were infected. When one U.S. fast-food chain sold undercooked hamburgers tainted with a virulent strain of E. coli bacteria in the same year, hundreds of children fell ill, and several died. And far-flung outbreaks of severe diarrheal disease have been traced to apparently healthful seed sprouts that were grown from contaminated seeds and then widely distributed from a single source. In the worst case, white radish sprouts were implicated in a Japanese E. coli outbreak that sickened about 10,000 and killed 11 in the summer of 1996.

It’s bad enough that in today's crowded and interconnected world small outbreaks can blossom inadvertently into huge epidemics. Equally worrisome, however, is the fact that terrorists can take advantage of that modern vulnerability and intentionally sow the seeds of a devastating disease.

A bioterrorism attack, as difficult to counter as almost any act of war, combines the best of microbial lethality and human ingenuity. Billions of infectious particles can be stored in a small vial, much easier to smuggle into a country than a nuclear device. Computer models have shown that an intentional outbreak of smallpox (public health officials report that some samples of the smallpox virus, stored for research after the disease was eradicated, are now unaccounted for) could spread uncontrollably almost before officials could take action to contain it. And as the U.S. learned firsthand in October, even a noncontagious disease like anthrax can wreak enormous havoc if it finds its way into the nation's mail system.

To a terrorist perhaps the most attractive feature of a plague is its fantastic capacity to create social unrest and political instability. "Infectious agents have the potential to trigger panic and fear like no other weapon," says Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota—and an epidemiologist with a worldwide reputation for his disease-sleuthing skills. "It's horrible to be eaten from without by a lion or something, but it's equally horrible to be eaten from the inside out by some terrible bug and to see that going on all around you. It's a very primal fear."

Bioterrorism was already a matter of heightened concern when planes crashed into the World Trade Center and the Pentagon on September 11. The CDC immediately warned U.S. public health agencies to be on the lookout for "unusual disease patterns associated with the events of September 11," a chilling hint of fear that the country might be under biological attack. A biowarfare unit from the CDC and a military team specially trained in disease detection were rushed to New York. As part of the security crackdown that followed the hijackings, federal officials temporarily grounded the nation's fleet of 3,500 crop duster airplanes, which they feared might be used to release a cloud of deadly microbes. When anthrax attacks did materialize a few weeks later, billions of dollars in resources were quickly redirected to bioterror defense.

And yet the recent emphasis on bioterrorism obscures a more pedestrian but equally important truth about infectious diseases: Even without the element of intentional terror, diseases are a huge source of human suffering—and a tremendously destabilizing force. Nearly half of the world's premature deaths (defined as deaths under the age of 45) are caused by infectious diseases. Some 30 million infants in developing countries remain unprotected by the lifesaving childhood vaccines that in the rest of the world are administered routinely; a million die each year from measles alone. It may not be obvious in the healthier nations, but from a microbe's point of view the world today—even with modern antibiotics and fancy vaccines—remains a virtual smorgasbord. With the recent reemergence of some of these diseases in richer nations, there is a growing recognition that no nation is an island.

"The lesson of West Nile is that any country is vulnerable," says David Heymann, executive director of communicable diseases at the World Health Organization in Geneva. "Countries have to realize that infectious diseases, regardless of their origins, can travel widely and affect anyone.” No nation, no matter how rich or seemingly protected, can be assured of a healthy and peaceful future as long as any nation is still an active breeding ground for the world's many and varied scourges.

Encouragingly, that reality is sinking in. A 1999 CIA report, an unclassified version of which was released in 2000, for the first time labeled global disease as a national security threat, elevating microbes to a level of political concern usually accorded nuclear warheads. Also in 2000, the United Nations Security Council convened a meeting to discuss the security threat of AIDS, the council's first meeting devoted to a health issue.

It's an important, even revolutionary, insight: Nations can enhance their own stability by taming diseases abroad. The catch is that public health improvements are difficult to implement in countries that are politically unstable or at war, as many of the world's most plague-afflicted nations are today.

The tale of Bonzali Katanga offers a tragic case in point. Katanga was the sole public health officer for the town of Durba in civil war-torn Democratic Republic of the Congo in 1998. The country's central government was a shambles, and Katanga's district was held by rebels. When men started dying by the dozens in the local gold mine, Katanga suspected Ebola or perhaps Marburg disease, caused by a similarly destructive virus. For months he tried desperately to raise the alarm, sending repeated radio messages to his superiors in the provincial capital of Kisangani while doing what little he could for his hemorrhaging patients. It took more than four months for officials to respond, and by the time they got there, Katanga was dead too. They found a vial of his blood in the refrigerator, which he had left to aid their investigation. Researchers later determined that it contained the deadly Marburg virus, which he'd contracted from the miners he had cared for until his own demise.

"Katanga's death was the worst kind of proof that political instability and disease go hand in hand," an American doctor who knew him said later, shaking with anger and grief. "I consider him a casualty of war."

Ironically, Bonzali Katanga was doing exactly what global health officials say needs to be done if emerging diseases are to be controlled. He was on the ground, keeping his eyes open, and alerting authorities to anything that appeared to be infectiously amiss.

The watchword is "surveillance," and it is the linchpin in the battle against emerging diseases. It need not be complicated or high-tech. When the cryptosporidiosis outbreak hit Milwaukee in 1993, it took officials many days to recognize they had a problem on their hands. The causative organism was not one they tested for routinely. And the foremost symptom of infection—severe diarrhea—was not the kind of thing people typically called their doctors about, at least not at first.

After the epidemic was brought under control, health officials conducted a retrospective study to see how they might have picked up on it sooner. The very best and earliest indication of trouble, they found, had been a vast increase in sales of over-the-counter antidiarrheal medicines—a simple sales spike that went unnoticed because no one was looking for it.

Milwaukee and other cities caught on. Now, for example, the New York City Department of Health has an arrangement with the Rite-Aid drugstore chain to receive weekly antidiarrheal sales data. In New Mexico, public health officials are starting to tally symptoms of people in emergency rooms and are using computers to look for groups of symptoms that might indicate the spread of a disease through the community several days before microbial culture results begin to yield clues.

But there's a place for high-tech surveillance as well. In perhaps the best example, scientists at 110 centers around the world collect samples of the influenza virus from patients each winter and conduct sophisticated genetic tests on those viruses, which mutate continually from year to year. The scientists pool this information to predict which strain will dominate in the upcoming year, and vaccine companies rush to make new batches of exactly the right vaccine just in time for the next flu season.

"It's an amazing and heroic effort that enables the pharmaceutical companies to make very effective vaccines,” says Barry Bloom, dean of the Harvard School of Public Health. "Our best protection is to know what's coming. With flu we're doing an astonishingly good job."

While vaccines are by far the most effective and cost-efficient weapons in the war against infectious diseases, precious little money is being spent on the development of new ones today, and vaccines for HIV, TB, and malaria remain elusive. Again, cooperation is required. For inspiration one need look no further than the current campaign against polio.

Thanks to an enormous international effort, polio may be eradicated as early as 2005. To accomplish that goal, countless dedicated health workers have been trekking into every village in the developing world and squeezing lifesaving drops of vaccine into the mouth of every child they can find.

The scale of the effort is almost beyond comprehension. During a series of National Immunization Days over the past several years, almost two billion children have been inoculated worldwide. Just in 2000, 550 million children—one-tenth of the world's population and almost 85 percent of its youth—received vaccinations, and in January 2001 India inoculated 152 million children in a single week.

The campaign has brought out the best in humanity, with entire wars suspended at times to allow health workers safe passage. In July 2001 the United Nations asked all warring parties in the Democratic Republic of the Congo to observe a cease-fire as part of the vaccination effort. Despite sporadic fighting and power outages, 11 million children were inoculated. And in a striking display of antimicrobial solidarity, Sierra Leone's president and his chief rebel rival posed for photographs together several years ago in the heat of that country's bloody civil war. They were mortal enemies, but they had a common cause that day, emblazoned on the T-shirts they both wore. "Kick Polio Out of Africa," the shirts declared.

There is no vaccine against West Nile virus, and none was under development while the disease was still ensconced in Africa. But now that it is racing through the United States, the U.S. government has started financing such an effort. And curiously, if it works, Americans will owe their thanks to a poor villager from Africa.

His name was Asibi. He lived in Ghana and came down with yellow fever in 1927. Scientists there who were studying yellow fever—a close viral relative of West Nile—isolated some of the yellow fever virus from Asibi's blood and cultivated it in laboratory dishes for years, first on a diet of mouse embryo cells and later on chicken embryo cells and eggs. The virus that survived this intensive dietary regimen became too weakened to cause disease, but when injected as a vaccine into healthy people, the virus stimulated the human immune system enough to protect the recipients of the vaccine against future yellow fever infections.

That so-called 17D strain of yellow fever virus has since been injected into the arms of 300 million people, shielding them from the mosquito-borne killer. Now scientists at a biotechnology firm in Massachusetts are using genetic engineering techniques to redecorate the 17D virus with a new molecular coating—one that will prime the immune system not against yellow fever but against West Nile.

The approach could be a metaphor for what it will take for humankind to stay ahead of its infectious foes. Just as microbes keep rearranging their old genomes to come up with new ways to overcome our defenses, we humans will have to draw upon everything we've ever inherited or learned—from pre-colonial jungle medicine to genetic engineering—if we are to stay ahead in the evolutionary arms race.

In some cases it will require incredibly sophisticated medicines, like some of those now in use to lessen the symptoms of HIV/AIDS. But more often than we once thought, simple solutions will work best. About 25 percent of childhood malaria deaths could be prevented tomorrow if children in affected areas simply slept under mosquito nets treated with insecticide. Cost: about five dollars a year per child.

Indeed, says Paul Ewald, a professor of biology at Amherst College, we can learn something about the power of simplicity from the microbes behind today's emerging diseases; microbes discovered long ago that evolution favors economy. It may be, for example, that some drugs and vaccines should be more nuanced than today's, taking aim at only the most virulent strains of each bug. Some culprits are bound to survive anyway, the thinking goes, and they might as well be ones we can get along with. Then the few deadly individuals that do remain—by becoming resistant to drugs or by hiding—will have to compete with the much bigger population of their less virulent peers.

"To be honest and realistic, we're going to have most of these organisms living with us long into the future in one form or another," Ewald says. "In those cases we're going to need evolutionary interventions that tip the balance of competition toward the benign strains. We need to control evolution."

It's a radical idea, controlling evolution, but one that is helping shape a new vision of the microbial future. It's a future in which truly beatable diseases will be hit with the full force of modern medicine. And it's a future in which some drugs and vaccines may actually pull a few punches: dodge and feint as their targets do.

Perhaps most important, it's a future in which people will begin to see that with just six billion of us against so many more of them, we all have a stake in even the most distant emerging microbial coup.
