
Veteran suicide and the mission of the church

Combat, community, and COVID-19

February 19, 2021

You have probably heard that veteran suicide is alarmingly high. The oft-cited statistic, which has become a rallying cry to end veteran suicide, is that 22 veterans take their lives each day. While some have helpfully chimed in to bring context to this number, suggesting that it is probably much lower, the reasoning behind why veteran suicide is so high has remained unchanged.

The misconception 

The commonly held belief about why veteran suicide is so high is typically distilled into this line of thought:

  1. Our troops are deployed to situations wherein they see and do terrible and perhaps even horrific things in combat.
  2. Exposure to abnormal and traumatizing experiences is what brings about post-traumatic stress disorder (PTSD).
  3. PTSD is nearly impossible to cope with, which eventually leads to suicide.

In short form, this line of reasoning makes sense and, for the most part, it has been the accepted narrative as to why things are the way they are. But there’s more than ample evidence that this narrative, this combat-PTSD-suicide chain, is mistaken. What’s more, if we assume suicide is mostly related to combat and PTSD, we may fail to help those most in need.

Breaking the combat-PTSD-suicide chain

A 2015 paper published in the Annals of Epidemiology demonstrated that the suicide rate among veterans is substantially higher than that of their civilian counterparts. The unsettling finding of the study, however, was that among military personnel, suicide was higher in noncombat roles, suggesting causes beyond combat exposure. The study concluded, “Veterans exhibit significantly higher suicide risk compared with the US general population. However, deployment to the Iraq or Afghanistan war, by itself, was not associated with the excess suicide risk.” This measured conclusion can be taken a step further: if someone never saw combat, their suicide could not have been caused by PTSD derived from combat exposure.

In 2019, The Air Force Times likewise reported on the branch’s alarm at the rise in suicides among its ranks. At the time, a mandatory stand-down, a “resilience tactical pause,” was ordered for all personnel across the branch to focus on suicide prevention. Suicides for that year were significantly higher than the previous year, jumping from 50 in 2018 to 78 in 2019.

This increase in deaths deserves more attention. The Air Force, though it possesses some MOSs (military occupational specialties; one’s job) that do experience combat, is predominantly noncombat in its roles; it is not the branch of the military that comes to mind with the combat-PTSD-suicide narrative. Yet disturbingly and tragically, it, too, is witnessing an increase in suicide.

Finally, the past year has brought new challenges. Suicide in the military has seen yet another wave of increases, rising 20% from the previous year. What was different from 2019 to 2020 that would significantly affect the rate of suicide? COVID-19. Many are reserved in giving an answer, careful not to label the correlation between COVID-19’s added stress and recent deaths as causation. The pattern of evidence, though, suggests that combat-PTSD-suicide is not necessarily the dominant reason behind veteran suicide, and that the increased isolation brought about by lockdowns and prolonged quarantines is worthy of a closer look.

PTSD

PTSD itself is a bit of a quagmire. It is almost inescapably tied to the belief that only someone who experiences combat can unwittingly acquire this diagnosis. This is false. Many may see combat and never experience a single symptom. As cited in an article appearing in Task & Purpose, a report from the Pentagon’s Inspector General shows that sexual assault is “[M]ore likely to result in post-traumatic stress disorder than going into combat.” Combat is neither a necessary link to PTSD nor the only way to experience its effects. But in the commonly held beliefs and discussions around veteran suicide, PTSD from combat sucks the air out of the room.

Though PTSD is a serious problem that has been connected to increased rates of suicide among veterans, at least two studies, one published in the Archives of General Psychiatry in 2009 and the other by the National Center for PTSD in 2017, suggest the link is not as definitive as most believe. PTSD simply does not account for enough deaths to satisfactorily answer the unsettling question of why veterans are taking their lives. In light of this evidence, where should we be looking for the reasons suicide is so high among veterans?

The complicated truth

A more complete answer to why veteran suicide is so high nests more neatly under the heading of sociological factors. Stated differently, it has more to do with culture, isolation, and the lack of shared experiences and values between the veteran population and their civilian counterparts than with combat and PTSD. Those who serve in the military are grafted into a subculture with its own language, communities, duties, judicial system, and boundaries and contours of honor and shame. The sum of these differences and experiences is unshared by the majority of the population. During the Second World War, approximately 9% of the population served directly in the military, and the rest of the country, while not wearing the uniform, still aided a war effort toward which the whole of society was oriented. Today, less than 1% of the population serves on active duty.

Serving in the military brings about experiences that will never be shared by the majority of the nation. This lack of shared experience and values isolates and exacerbates the problems our society is already plagued with in the veteran’s personal life.

It isn’t only that knowing a service member is now less likely, but also that our relationships look drastically different than they did a generation ago. The average Facebook user has 338 friends. Contrast this with research indicating that 75% of people are not friends with their neighbors, 26% of people don’t know their neighbors, and social gatherings with neighbors were already relatively rare before COVID-19. If we want to dig deeper into why veteran suicide is high, community disconnect is a prominent factor that demands further investigation.

We are already more detached from community than we realize. Geographically, we live in one place, work in another, shop on one side of town, go to church on the other, and pursue our weekend hobbies and recreation somewhere different still. This description of our disparate lives is not an anomaly but the norm for many. Often the only thing we have in common with our neighbors is that we live next to them. Other than that, we are different people with different lives who rarely intersect.

Exacerbating our own problems

New York Times bestselling author Sebastian Junger struck a chord with many in his recent book, Tribe: On Homecoming and Belonging. Junger provocatively suggested that the problem of PTSD was not a matter of “what’s wrong with them,” referring to our troops, but rather “what’s wrong with us,” referring to our culture outside of the military. While there are areas in which Junger does not fully deliver on his thesis, his impulse is correct: the issues of our culture and society are no different from those within the military.

For every specialty and niche interest that exists today, community options abound. But this menu of choices has not brought people together; it has divided, subdivided, and distanced people into communities based on hobbies, shopping preferences, media consumption, and even our places of worship. Yet we do not need more of the same; we need more of each other. Where we would once pursue relationships with those in our communities, we now seek friendships through social media. Where personal friendships could serve as a kind of “general practice” for struggles with anxiety or depression, veterans are now outsourced to experts, when what they need is not another visit to a therapist or a prescription refill but authentic relationships that are abiding, meaningful, and faithfully attended to. Those who believe veteran suicide is something only a trauma specialist can address disqualify themselves from the help they could offer through genuine friendship.

Suicide and the mission of the church

The trends of suicide in the United States reveal some alarming trajectories. Even before the prolonged isolation and social restrictions of COVID-19, suicides in the U.S. had increased 33% from 1999 to 2017. If the factors listed above are truly more decisive in suicide than combat or PTSD, then we should expect suicide to continue to increase, and veteran suicide will follow the same trend in our non-communal and increasingly isolated society. Stated differently, veteran suicide is a sneak peek at where we are headed as a culture. If we desire to combat suicide, the place to do it is within communities that seek to disrupt isolation by loving one’s neighbor. The vehicle best equipped with a mission and purpose for reaching communities across our country is the church, armed with the good news of Jesus Christ.

Josh Holler

Josh Holler is an associate pastor at First Baptist Church Duncan in Duncan, Oklahoma. He is a husband, father, and author of Redeeming Warriors: Veteran Suicide, Grieving, and the Fight for Faith (Christian Focus). Josh is pursuing his Ph.D. in Ethics at Midwestern Baptist Theological Seminary.
