Midway through the last century, the prominent virologist F. M. Burnet stated: “If one looks around the medical scene in North America or Australia, the most important current change he sees is the rapidly diminishing importance of infectious diseases. . . . With full use of the knowledge we already possess, the effective control of every important infectious disease . . . is possible.”
On the contrary, the wholesale control of infectious diseases—once presumed to be just around the corner—has yet to materialize. To be sure, remarkable progress has been made in our understanding, prevention and management of infections in the years since Dr. Burnet’s unrealized prediction. Advances in sanitation, hygiene, nutrition, diagnostic capabilities, antimicrobial therapies and immunization programs (to name but a few) have reshaped the world we live in and saved countless lives. But the assumption that civilization will one day marshal these medical resources to once-and-for-all regain an Edenic dominion over the whole realm of pathogenic microbes appears to be more hubris than anything else. Even as medical advancements are made and old foes are put in check, new challenges continue to arise.
Take, for instance, the concerning trend of antibiotic resistance[1]. The advent of affordable and effective antibiotics during the previous century was a major boon and understandably inspired high confidence in the medical community. But almost as soon as widespread use of these wonder drugs began, reports of infections caused by resistant bacteria started to emerge. The incidence and clinical significance of antibiotic resistance have risen sharply ever since.
The Centers for Disease Control and Prevention currently estimates that two million people per year in the United States have serious illnesses due to drug-resistant bacteria, with nearly 23,000 deaths annually. Global surveillance has shown a steady trend of emerging antibiotic resistance, with new types of resistance to last-resort drugs predictably appearing and then spreading. A recent report of a patient in Pennsylvania with a bacterial infection harboring a type of resistance not previously seen in the United States (the mcr-1 gene in E. coli, which confers resistance to a last-resort antibiotic called colistin) has raised further concern. Many media outlets erroneously reported that the patient’s infection was resistant to all known antibiotics. It was not, but the difficult reality remains that the discovery of this new resistance mechanism (the last puzzle piece, so to speak) suggests that such pan-resistant infections are right around the corner.
It’s important to understand that these ever-encroaching public health threats have not come out of left field. This is the trajectory we have been on for quite some time. The emergence of resistance is a simple matter of natural adaptation for any bacteria exposed to the selective pressure of antibiotics. Even as an antibiotic is mowing down infectious pathogens (along with a host of innocent bystanders), small populations of bacteria seek to survive and adapt. If someone were trying to kill you, you’d try to squirm out of it too. In a sense, this is all that antibiotic resistance is—bacteria figuring out how to dodge or block the bullets being sent their way.
But just because it’s a natural process doesn’t mean we’re not partly responsible for the escalating problem. Inappropriate use of antibiotics fuels the engine that drives resistance and increases the number of difficult-to-treat infections. Common examples of inappropriate antibiotic use include taking antibiotics when they are not indicated (e.g., when you have a viral illness), treating bacterial infections with antibiotics that have an unnecessarily broad spectrum of activity (using a shotgun approach when you need the sharpshooter), and the overuse of antibiotics in food-producing animals.
Healthcare providers often talk about judicious use of antibiotics, though we are admittedly better at talking about it than practicing it. Even so, an encouraging emphasis has been placed on antibiotic stewardship, which is a commitment to use antibiotics appropriately and responsibly. In addition to getting healthcare providers committed to this, good stewardship also involves educating the community so that their expectations are well informed and they’re not tempted to pressure providers into bad prescribing practices.
The concept of stewardship brings us back to Eden. While we have no cause to think we can usher our world back to a paradisiacal state free from infectious diseases, neither should we adopt a slash-and-burn mentality. As God intended from the beginning, we are called to be good stewards of the resources he provides. He put us here and told us to care for his creation, to till and tend it, to thrive and flourish with it. If we do not live up to this calling, there are real consequences we must deal with. Most of us can see the sense in this when it comes to environmental resources, such as a rainforest, but it applies to his gifts of common grace as well. Inasmuch as it plays a role in helping human society thrive, medicine should be seen as a gift from God, a common grace that can at times offer a temporal restraint on sickness, suffering, and even death. Appropriate and responsible use of the medical resources God has put within our reach is a matter of good stewardship. If we abuse these gifts, we threaten our own well-being.
If we think of ourselves as conquerors of God’s creation rather than stewards of it, we will tend toward irresponsible and overreaching use of the good gifts he provides. Sometimes that means felling a rainforest, thinking we can just grow more. Sometimes it means using antibiotics injudiciously, thinking we’re in control of our infectious destiny. Hubris such as this has consequences. In this particular case, I’m finding the consequences increasingly difficult to treat with my dwindling repertoire of effective antibiotics.