Drones, Foreign Policy, and Christian Ethics

August 6, 2014

The use of unmanned aerial vehicles, or drones, to target enemy forces raises no shortage of legal and ethical questions. Their use can seem a cold, calculated, and disconnected way of taking the life of an enemy. There seems to be something different about drones. Americans, and American Christians, are uneasy with an individual pilot controlling an unmanned vehicle from a remote location and attacking an enemy combatant abroad. Fortunately, the wisdom of the Christian just war tradition speaks to these concerns. The ethical considerations of going to war and using deadly force against an enemy do not change simply because the technological platform affords an additional degree of separation.

Early Christian thinkers, among them Ambrose, Augustine, and later Thomas Aquinas, began the ethical and theological inquiry into war. Though there is some inescapable overlap, just war doctrine comprises two areas: jus ad bellum, which governs the decision to go to war, and jus in bello, which governs conduct in war. Once hostilities begin, however, there is no point at which belligerents cross a sort of legal or ethical Rubicon beyond which they need no longer consider jus ad bellum.

The jus ad bellum analysis asks when it is morally just to go to war, or to participate in one. Typically, this comes down to three basic considerations: just cause, proper authority, and right intention. Fighting terrorism is a just cause. First, Al Qaeda has declared open war on the United States and its civilians and has systematically attacked our country for decades. Second, the goals of terrorism are to attack both civilian and military targets in order to undermine social order. Terrorism is wanton destruction of life, indiscriminate and unrestrained.

The question of proper authority requires a two-part analysis of both the Constitution and, secondarily, the international law the U.S. seeks to abide by. The U.S. has legal authority, under its own Constitution, to conduct drone strikes against foreign terrorist targets. The Constitution provides some insight as to when force may be used abroad. Article I, section 8, clause 11 grants Congress the power to declare war, and Congress has exercised that power with respect to terrorism. The Authorization for Use of Military Force (AUMF), passed in response to the September 11 attacks, serves as the legal basis for U.S. action against terrorist groups abroad; this formal congressional act authorizes the President to direct attacks against Al Qaeda and its allies. Though it may be time to revisit the language of the AUMF, its legal authority is generally accepted as satisfying the Constitution’s declaration of war requirement.

Moreover, the international legal framework allows the United States to defend itself against terrorist groups abroad. The United Nations Charter generally prohibits war and armed conflict, and signatories of the Charter, of which the U.S. is one, agree to be bound by the document. Article 51, however, preserves a state’s inherent right to use force in individual or collective self-defense. Some question whether this right extends beyond responses to massive armed aggression, claiming that it does not cover defense against non-state actors like terrorist groups. That reading is wrong because it would require a state under prolonged but low-intensity attack to do nothing in defense of its citizens, borders, or property. The only logical interpretation of Article 51 must allow states to defend themselves against various forms of attack, including those by terrorist groups and other non-state actors. Taken together, the domestic and international legal analyses show that the war on terrorism meets the proper authority principle.

Finally, the requirement of right intention warrants serious consideration. J. Daryl Charles suggests that “unjust war is best illustrated by what does not constitute right intention.” Pride, bloodlust, unnecessary territorial expansion, and national aggrandizement are all marks of unjust war. Right intention focuses not on killing the enemy but on stopping the enemy from doing harm. Here it is easy to become cynical about the President’s increased use of drones and targeted killings. Some critics suggest that the increase is merely a way to avoid the difficult questions surrounding detention, interrogation, and legal trials. But we should not so quickly assume ill intent among our military leaders. It is just as possible that the increased use of targeted killings is a more effective and decisive way to win the battle at hand.

The question of intent also demands a deliberate effort to recognize the humanity of those at the other end of the drone. It is tempting to justify each strike as serving some moral end without ever considering the human cost of war. Remote-controlled machines of war can tempt those who wield them to set aside the ethics of taking a life. The right intention principle is the most difficult to assess, and there is no clear answer as it relates to the increased use of drones for targeted killings. Thus, the intent behind their increased use must be continually evaluated.

Jus in bello doctrine, concerning actions in war, can be distilled to three core principles: distinction, proportionality, and necessity. The distinction principle, which has been codified in various treaties and domestic laws, requires those who use force to distinguish between civilian and military targets. Civilians may not be the object of attack. In the context of targeted killing and the use of drones, pilots and commanders are required to make decisions as to who is a valid target. This is complicated in counterterrorism conflicts because of the ability of terrorist groups to act and look like civilians. Nevertheless, terrorist enemy combatants are fair targets both ethically and under the law of war because of their actions.

The necessity principle restricts the use of force to those actions, not otherwise prohibited by law, that are required to defeat the enemy as quickly as possible. The question is whether a particular action is necessary for the successful completion of the military objective. Targeted killings quickly disrupt leadership structures and training facilities. Each targeted killing or bombing campaign should be limited by the question of whether it is needed to complete the objective. Targeted killings and drone strikes have primarily focused on individuals or small groups, not villages or large compounds, which suggests that the military is making efforts to limit the scope of its attacks.

Finally, jus in bello requires the use of force to be proportional. Proportionality does not demand the complete avoidance of civilian casualties, nor does the law require such a high standard. It requires actors to employ methods and tactics that avoid civilian casualties as much as possible relative to the scope of the military objective. In 2011, the United Nations reported that less than 5 percent of casualties from drone strikes were civilians. Again, this suggests that the military is taking care not to harm civilians. Moreover, the technological advantages of drones allow for more precision than was previously available to military commanders. Drones are, by their nature, far more limited uses of force than a traditional aerial assault.

There is an ethical obligation to confront evil. Christians living in a democracy are placed in a particularly difficult moral dilemma. They cannot choose to stand on the sidelines and make no decisions concerning the more materialistic aspects of government.

It is crucial for Christians of all stripes to reject the temptation of pacifism. Though alluring, it is merely a mirage. Guenter Lewy, a German-born historian and former member of the Jewish Brigade, argues that those who seek to avoid war may choose to “avoid the moral dilemmas posed by the world of statesmanship and statecraft . . . but they have no right to sacrifice others for this end.” In his essay “Why I Am Not a Pacifist,” C.S. Lewis rejected pacifism on numerous ethical grounds:

If I tried to become [a pacifist] I should find a very doubtful factual basis, an obscure train of reasoning, a weight of authority both human and Divine against me, and strong grounds for suspecting that my wishes had directed my decision…It may be, after all, that Pacifism is right. But it seems to me very long odds, longer odds than I would care to take with the voice of almost all humanity against me.

The Christian just war tradition rejects the quixotic idealism of the world and embraces the realism of man’s fallen nature. Blind pacifism is not an option.

A discussion of war and faith is incomplete without an analysis of the command to love thy neighbor. It seems impossible to follow Christ’s command to love thy neighbor and also be willing to justify going to war against a neighbor. Yet, though it may be better to turn the other cheek, it is an abdication of duty and love to turn your neighbor’s cheek for him. At times, loving thy neighbor may require reluctantly taking up arms in the effort to achieve a greater social good like peace or justice. Thus, despite the tension, just war can be an act of charity, a love of neighbor, when aimed at eliminating wanton murder, genocide, nationalism, or other evils. This is not to fall into a trap of moral or theological legalism. But neither do good neighbors sit idly by, watching their neighbors suffer. It is a tension, to be sure, and one that must be maintained. To take part in violence without properly weighing its morality is as wrong as standing idle in the face of suffering.

Just war theory equips Christians with the tools necessary to love their neighbors and seek justice. The ethical considerations do not change simply because technology advances. Rather, long-standing ethical principles can be readily applied to the use of drones. Though it is important not to lose the human context of actions that result from the stroke of a keyboard rather than the stroke of a sword, the principles remain the same. Christians living in free societies must engage in the hard work of weighing the morality of conflict as it arises. There is no abdicating the responsibility to engage in the operations of the state.

The view expressed in this commentary belongs solely to the author and is not necessarily the view of the ERLC.

Brandon James Smith


Article 12: The Future of AI

We affirm that AI will continue to be developed in ways that we cannot currently imagine or understand, including AI that will far surpass many human abilities. God alone has the power to create life, and no future advancements in AI will usurp Him as the Creator of life. The church has a unique role in proclaiming human dignity for all and calling for the humane use of AI in all aspects of society.

We deny that AI will make us more or less human, or that AI will ever obtain a coequal level of worth, dignity, or value to image-bearers. Future advancements in AI will not ultimately fulfill our longings for a perfect world. While we are not able to comprehend or know the future, we do not fear what is to come because we know that God is omniscient and that nothing we create will be able to thwart His redemptive plan for creation or to supplant humanity as His image-bearers.

Genesis 1; Isaiah 42:8; Romans 1:20-21; 5:2; Ephesians 1:4-6; 2 Timothy 1:7-9; Revelation 5:9-10

Article 11: Public Policy

We affirm that the fundamental purposes of government are to protect human beings from harm, punish those who do evil, uphold civil liberties, and to commend those who do good. The public has a role in shaping and crafting policies concerning the use of AI in society, and these decisions should not be left to those who develop these technologies or to governments to set norms.

We deny that AI should be used by governments, corporations, or any entity to infringe upon God-given human rights. AI, even in a highly advanced state, should never be delegated the governing authority that has been granted by an all-sovereign God to human beings alone. 

Romans 13:1-7; Acts 10:35; 1 Peter 2:13-14

Article 10: War

We affirm that the use of AI in warfare should be governed by love of neighbor and the principles of just war. The use of AI may mitigate the loss of human life, provide greater protection of non-combatants, and inform better policymaking. Any lethal action conducted or substantially enabled by AI must employ human oversight or review. All defense-related AI applications, such as underlying data and decision-making processes, must be subject to continual review by legitimate authorities. When these systems are deployed, human agents bear full moral responsibility for any actions taken by the system.

We deny that human agency or moral culpability in war can be delegated to AI. No nation or group has the right to use AI to carry out genocide, terrorism, torture, or other war crimes.

Genesis 4:10; Isaiah 1:16-17; Psalm 37:28; Matthew 5:44; 22:37-39; Romans 13:4

Article 9: Security

We affirm that AI has legitimate applications in policing, intelligence, surveillance, investigation, and other uses supporting the government’s responsibility to respect human rights, to protect and preserve human life, and to pursue justice in a flourishing society.

We deny that AI should be employed for safety and security applications in ways that seek to dehumanize, depersonalize, or harm our fellow human beings. We condemn the use of AI to suppress free expression or other basic human rights granted by God to all human beings.

Romans 13:1-7; 1 Peter 2:13-14

Article 8: Data & Privacy

We affirm that privacy and personal property are intertwined individual rights and choices that should not be violated by governments, corporations, nation-states, and other groups, even in the pursuit of the common good. While God knows all things, it is neither wise nor obligatory to have every detail of one’s life open to society.

We deny the manipulative and coercive uses of data and AI in ways that are inconsistent with the love of God and love of neighbor. Data collection practices should conform to ethical guidelines that uphold the dignity of all people. We further deny that consent, even informed consent, although requisite, is the only necessary ethical standard for the collection, manipulation, or exploitation of personal data—individually or in the aggregate. AI should not be employed in ways that distort truth through the use of generative applications. Data should not be mishandled, misused, or abused for sinful purposes to reinforce bias, strengthen the powerful, or demean the weak.

Exodus 20:15; Psalm 147:5; Isaiah 40:13-14; Matthew 10:16; Galatians 6:2; Hebrews 4:12-13; 1 John 1:7

Article 7: Work

We affirm that work is part of God’s plan for human beings participating in the cultivation and stewardship of creation. The divine pattern is one of labor and rest in healthy proportion to each other. Our view of work should not be confined to commercial activity; it must also include the many ways that human beings serve each other through their efforts. AI can be used in ways that aid our work or allow us to make fuller use of our gifts. The church has a Spirit-empowered responsibility to help care for those who lose jobs and to encourage individuals, communities, employers, and governments to find ways to invest in the development of human beings and continue making vocational contributions to our lives together.

We deny that human worth and dignity is reducible to an individual’s economic contributions to society alone. Humanity should not use AI and other technological innovations as a reason to move toward lives of pure leisure even if greater social wealth creates such possibilities.

Genesis 1:27; 2:5; 2:15; Isaiah 65:21-24; Romans 12:6-8; Ephesians 4:11-16

Article 6: Sexuality

We affirm the goodness of God’s design for human sexuality which prescribes the sexual union to be an exclusive relationship between a man and a woman in the lifelong covenant of marriage.

We deny that the pursuit of sexual pleasure is a justification for the development or use of AI, and we condemn the objectification of humans that results from employing AI for sexual purposes. AI should not intrude upon or substitute for the biblical expression of sexuality between a husband and wife according to God’s design for human marriage.

Genesis 1:26-29; 2:18-25; Matthew 5:27-30; 1 Thessalonians 4:3-4

Article 5: Bias

We affirm that, as a tool created by humans, AI will be inherently subject to bias and that these biases must be accounted for, minimized, or removed through continual human oversight and discretion. AI should be designed and used in such ways that treat all human beings as having equal worth and dignity. AI should be utilized as a tool to identify and eliminate bias inherent in human decision-making.

We deny that AI should be designed or used in ways that violate the fundamental principle of human dignity for all people. Neither should AI be used in ways that reinforce or further any ideology or agenda, seeking to subjugate human autonomy under the power of the state.

Micah 6:8; John 13:34; Galatians 3:28-29; 5:13-14; Philippians 2:3-4; Romans 12:10

Article 4: Medicine

We affirm that AI-related advances in medical technologies are expressions of God’s common grace through and for people created in His image and that these advances will increase our capacity to provide enhanced medical diagnostics and therapeutic interventions as we seek to care for all people. These advances should be guided by basic principles of medical ethics, including beneficence, non-maleficence, autonomy, and justice, which are all consistent with the biblical principle of loving our neighbor.

We deny that death and disease—effects of the Fall—can ultimately be eradicated apart from Jesus Christ. Utilitarian applications regarding healthcare distribution should not override the dignity of human life. Furthermore, we reject the materialist and consequentialist worldview that understands medical applications of AI as a means of improving, changing, or completing human beings.

Matthew 5:45; John 11:25-26; 1 Corinthians 15:55-57; Galatians 6:2; Philippians 2:4

Article 3: Relationship of AI & Humanity

We affirm the use of AI to inform and aid human reasoning and moral decision-making because it is a tool that excels at processing data and making determinations, which often mimics or exceeds human ability. While AI excels in data-based computation, technology is incapable of possessing the capacity for moral agency or responsibility.

We deny that humans can or should cede our moral accountability or responsibilities to any form of AI that will ever be created. Only humanity will be judged by God on the basis of our actions and that of the tools we create. While technology can be created with a moral use in view, it is not a moral agent. Humans alone bear the responsibility for moral decision making.

Romans 2:6-8; Galatians 5:19-21; 2 Peter 1:5-8; 1 John 2:1

Article 2: AI as Technology

We affirm that the development of AI is a demonstration of the unique creative abilities of human beings. When AI is employed in accordance with God’s moral will, it is an example of man’s obedience to the divine command to steward creation and to honor Him. We believe in innovation for the glory of God, the sake of human flourishing, and the love of neighbor. While we acknowledge the reality of the Fall and its consequences on human nature and human innovation, technology can be used in society to uphold human dignity. As a part of our God-given creative nature, human beings should develop and harness technology in ways that lead to greater flourishing and the alleviation of human suffering.

We deny that the use of AI is morally neutral. It is not worthy of man’s hope, worship, or love. Since the Lord Jesus alone can atone for sin and reconcile humanity to its Creator, technology such as AI cannot fulfill humanity’s ultimate needs. We further deny the goodness and benefit of any application of AI that devalues or degrades the dignity and worth of another human being. 

Genesis 2:25; Exodus 20:3; 31:1-11; Proverbs 16:4; Matthew 22:37-40; Romans 3:23

Article 1: Image of God

We affirm that God created each human being in His image with intrinsic and equal worth, dignity, and moral agency, distinct from all creation, and that humanity’s creativity is intended to reflect God’s creative pattern.

We deny that any part of creation, including any form of technology, should ever be used to usurp or subvert the dominion and stewardship which has been entrusted solely to humanity by God; nor should technology be assigned a level of human identity, worth, dignity, or moral agency.

Genesis 1:26-28; 5:1-2; Isaiah 43:6-7; Jeremiah 1:5; John 13:34; Colossians 1:16; 3:10; Ephesians 4:24