By / Jun 19

“Southern Baptist messengers from around the country are back home after spending two days in New Orleans for their annual meeting last week. While there, they addressed topics such as America’s immigration crisis, the controversies surrounding so-called ‘gender transitions’ – and a biblical response to artificial intelligence.”

Read the full article here.

By / Jun 14

NEW ORLEANS, La., June 14, 2023 — The Southern Baptist Convention became the first national denomination to pass a definitive statement on the ethics of artificial intelligence, which will become the cornerstone of the ERLC’s advocacy on this issue.

During the June 13-14 annual meeting, messengers of the nation’s largest Protestant denomination also voted on and overwhelmingly affirmed other significant resolutions on the topics of immigration and gender transitions.

The SBC’s Ethics & Religious Liberty Commission will remain a strong voice for dignity on issues of artificial intelligence, immigration and gender, as the resolutions supported the current positions advocated by the organization. 

Brent Leatherwood, president of the ERLC, commented below on each of the three resolutions and how they related to the ERLC’s mission to assist churches by helping them understand the moral demands of the gospel. 

On Artificial Intelligence

“Our resolutions committee deserves all the appreciation we can muster for crafting this first-of-its-kind resolution for any denomination or network of churches. Artificial Intelligence has been a hot topic, both in Washington and on the international stage. This resolution comes at an opportune time and proves once again that even when it comes to the leading edge of emerging technologies, the Bible, as always, gives us principles to guide us in uncharted waters.” 

On Wisely Engaging Immigration

“Our convention of churches has consistently called for a secure border and for immigrants to be treated with dignity. This resolution once again asserts our commitment to these twin principles that should never be pitted against one another. It rightly calls on our nation’s officials to come together and create solutions to solve our immigration crisis.” 

On Opposing ‘Gender Transitions’

“As the Baptist Faith & Message states, gender is a gift and is an essential part of the ‘goodness of God’s creation.’ It is not fluid, self-defined, or subject to the whims of a prevailing culture at odds with biological reality. This resolution rightly affirms those state governments that have taken steps to protect children from becoming pawns in the sexual revolution through harmful interventions and surgeries. At the same time it confirms the SBC will continue to be a strong voice advocating against these exploitative efforts that render far too many children and young people vulnerable.”

The ERLC has long advocated for human dignity, life, religious liberty, and marriage and family. To learn more about our work and current priorities, visit erlc.com.

By / May 12

Over the past year, there’s been increasing debate about the nature and classification of Large Language Models (LLMs) like ChatGPT, an artificial intelligence chatbot developed by OpenAI and released in November 2022. Are these systems truly representative of artificial intelligence (AI)? Do they pose a threat to humans? The answers, as with many things in the complex world of technology, are not as straightforward as they might seem.

What is a Large Language Model?

An LLM is a type of computer program that’s been trained to understand and generate human-like text. It’s a product of a field of computer science called AI, specifically a subfield known as natural language processing (NLP). ChatGPT (which has been powered by several model versions, including GPT-3, GPT-3.5, and GPT-4) is currently the most popular and widely used LLM.

If you’ve ever started typing a text message on your smartphone, and it suggests the next word you might want to use (predictive text) or suggests a spelling (autocorrect), you’ve used a basic form of a language model. LLMs apply that concept on a larger and more complex scale.

An LLM has been trained on a broad and diverse range of internet text. It then uses a machine learning process, including advanced statistical analysis, to identify patterns in the data and uses that information to generate responses for a human user. The training sets are also incredibly massive. The older, free version of ChatGPT (GPT-3.5) was trained on the equivalent of over 292 million pages of documents, or 499 billion words. It uses 175 billion parameters (the numerical weights of a neural network, which are adjusted during training).

When you interact with a large language model, you can input a piece of text, like a question or a statement (known as a “prompt”), and the model will generate a relevant response based on what it has learned during its training. For example, you can ask it to write essays, summarize long documents, translate languages, or even write poetry.
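The “predict the next word” idea behind both predictive text and LLMs can be illustrated with a deliberately tiny sketch. The toy bigram model below (the function names and the sample corpus are invented for illustration) counts which word tends to follow which in a training text, then greedily extends a prompt one most-likely word at a time. Real LLMs learn billions of neural-network parameters rather than simple word-pair counts, but the prompt-in, likely-continuation-out loop is the same in spirit.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, how often every other word follows it."""
    words = corpus.split()
    counts = defaultdict(Counter)
    for current_word, next_word in zip(words, words[1:]):
        counts[current_word][next_word] += 1
    return counts

def generate(counts, prompt, num_words=5):
    """Greedily extend the prompt with the most likely next word each step."""
    words = prompt.split()
    for _ in range(num_words):
        followers = counts.get(words[-1])
        if not followers:
            break  # the model has never seen this word, so it cannot continue
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

corpus = (
    "the cat sat on the mat . "
    "the dog sat on the mat . "
    "the cat sat by the door ."
)
model = train_bigrams(corpus)
print(generate(model, "the cat", num_words=3))  # prints "the cat sat on the"
```

Note that the model has no understanding of cats or mats; it only reproduces statistical patterns from its training text, which is also why such systems can confidently produce fluent nonsense.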

The output produced by such models can often be astoundingly impressive. But LLMs can also produce “hallucinations,” a term for generated content that is nonsensical or unfaithful to the provided source content. LLMs do not have an understanding of text like humans do and can sometimes make mistakes or produce outputs that range from erroneous to downright bizarre. LLMs also don’t have beliefs, opinions, or consciousness—they merely generate responses based on patterns they’ve learned from the data they were trained on.

In short, an LLM is a sophisticated tool that can help with tasks involving text, from answering questions to generating written content.

Are LLMs truly AI?

Before considering whether LLMs qualify as AI, we need to define how the term AI is being used. In broad terms, AI refers to the simulation of human intelligence processes by machines, especially computer systems. These processes include learning, reasoning, problem-solving, perception, and the ability to use human languages. The key term is simulation. AIs do not have consciousness, so they cannot perform such rational functions as thinking or understanding, or possess such attributes as emotions and empathy.

In the strictest sense, LLMs like GPT-3 fall under the umbrella of AI, specifically the subgroup known as generative AI. LLMs learn from large datasets, recognize patterns in human language, and generate text that mirrors human-like understanding. However, there’s a distinction to be made between what is often referred to as “narrow AI” and “general AI.”

Narrow AI systems, also known as weak AI, are designed to perform a specific task, like language translation or image recognition. Although they may seem intelligent, their functionality is limited to the tasks they’ve been programmed to do. ChatGPT and similar LLMs fall into this category.

In contrast, general AI, also referred to as strong AI, represents systems that possess the ability to understand, learn, adapt, and implement knowledge across a broad range of tasks, much like a human being. This level of AI, which would essentially mirror human cognitive abilities, has not yet been achieved. Some Christians believe that AI will never reach that level because God has not given man the power to replicate human consciousness or reasoning abilities in machines.

While LLMs are a form of AI, they don’t possess a human-like understanding or consciousness. They don’t form beliefs, have desires, or understand the text they generate. They analyze input and predict an appropriate output based on patterns they’ve learned during training.

Are LLMs a threat?

LLMs are a category of tools (i.e., devices used to perform a task or carry out a particular function). Like almost all tools, they can and will be used by humans in ways that are both positive and negative. 

Many of the concerns about AI are misdirected, since they are fears based on “general AI.” This type of concern is reflected in science fiction depictions of AI, where machines gain sentience and turn against humanity. However, current AI technology is nowhere near achieving anything remotely reflecting sentience or true consciousness. LLMs are also not likely to be a threat in the way that autonomous weapons systems can be.

This is not to say that LLMs do not pose a danger; they do in ways that are similar to social media and other internet-related functions. Some examples are:

Deepfakes: Generative AI can create very realistic fake images or videos, known as deepfakes. These could be used to spread misinformation, defame individuals, or impersonate public figures for malicious intent.

Phishing attacks: Phishing is the fraudulent practice of sending emails or other messages purporting to be from reputable companies in order to induce individuals to reveal personal information such as passwords and credit card numbers. AI can generate highly personalized phishing emails that are much more convincing than traditional ones, potentially leading to an increase in successful cyber attacks.

Disinformation campaigns: AI could be used to generate and spread false news stories or misleading information on social media to manipulate public opinion.

Identity theft: In 2021 alone, 1,434,698 Americans reported identity theft, with 21% of the victims reporting they had lost more than $20,000 to such fraud. AI could be used to generate convincing fake identities for fraudulent purposes.

While there are also many positive uses for generative AI, ongoing work in AI ethics and policy is needed to limit and prevent such malicious uses.

As the ERLC’s Jason Thacker says, a Christian philosophy of technology is wholly unique in that it recognizes 1) that God has given humanity certain creative gifts and the ability to use tools, and 2) that how we use these tools forms and shapes us. “Technology then is not good or bad, nor is it neutral,” says Thacker. “Technology, specifically AI, is shaping how we view God, ourselves, and the world around us in profound and distinct ways.”

See also: Why we (still) need a statement of principles for AI

By / May 8

In April 2019, a group of over 70 evangelical leaders signed and launched Artificial Intelligence: An Evangelical Statement of Principles (Spanish version) with two goals in mind. First, we wanted to help the Church proactively think about the myriad of ways that AI is shaping our society and provide a sound theological, philosophical, and ethical framework with which to wisely navigate these tools. Second, we sought to present a distinctly Christian view on the fundamental questions being raised amid the social and political ramifications of the expanding development and application of AI. 

One of the fascinating aspects of the current cultural conversation on AI is how quickly people have become entranced by these technologies, especially after the launch of ChatGPT and other generative AI tools. While many are excited about the advances these tools may bring, many are incredibly disturbed by their dangers and risks. Debates over the future of AI have centered on the reality that these tools are doing things once reserved solely for human beings, leading many to ask the age-old question: What does it mean to be human? 

Being human in an age of machines

The perennial question of what it means to be human becomes even more important in this age of emerging technologies. In the statement, we addressed it by affirming the unique nature of humanity and denying “that any part of creation, including any form of technology, should ever be used to usurp or subvert the dominion and stewardship which has been entrusted solely to humanity by God; nor should technology be assigned a level of human identity, worth, dignity, or moral agency.” This is rooted in the Christian understanding of how God bestowed a unique status on humanity with the imago Dei.

Many are amazed by and fearful of advanced AI systems, as they fundamentally challenge much of what we have assumed about the uniqueness of humanity. For generations, we have assumed that what it meant to be human was the ability to think, create, and perform certain complex tasks. An attribute-based view of humanity and the imago Dei is prevalent throughout much of human history. While it is true that humanity does seem to model certain features such as reason/rationality (substantive), gregariousness (relational), and representation (functional), do these attributes or capacities ontologically ground human identity, or do they better represent a fundamental status that human beings have in light of how God set us apart from the rest of creation as those made in his image?

As German Catholic philosopher Robert Spaemann notes in Persons, “human beings have certain definite properties that license us to call them ‘persons’; but it is not the properties we call persons, but the human being who possess the properties.” A person in Spaemann’s framework is someone vs. something, thus, regardless of one’s capacities or attributes, they are a person by simply being a member of the human species. He writes that “there can, and must, be one criterion for personality, and one only; that is biological membership of the human race.”

While human beings are a specific kind of creature who might exhibit certain characteristics and attributes, human dignity isn’t based on the presence of those particular attributes or capacities. While much more can and should be said, this truth must be central to the ongoing debates over the development and use of AI today.

The future of AI

A related and second question rising above the fray today centers on where these technologies are headed in terms of their role in our society and how we are to view them as they grow in their imitation of certain human capacities. Much of the popular discussion surrounding AI notes the seemingly unstoppable nature of these tools and how they will soon rival (or even overtake) humanity’s place in society. In the 2019 statement, we noted that “AI will continue to be developed in ways that we cannot currently imagine or understand, including AI that will far surpass many human abilities. God alone has the power to create life, and no future advancements in AI will usurp Him as the Creator of life.”

The future of AI is an open question of sorts, but Christians must recognize that there are certain inherent limitations to these technologies. Indeed, much of today’s doomsday mentality is rooted in a view of technology at odds with the theological, philosophical, and ethical framework provided in Scripture. In contrast to the two most common views, a biblical framework recognizes that technology is neither autonomously deterministic nor simply a neutral instrument that we use.

As I wrote in The Digital Public Square: Christian Ethics in a Technological Society, a Christian philosophy of technology is wholly unique in that it recognizes 1) that God has given humanity certain creative gifts and the ability to use tools, and 2) and that how we use these tools forms and shapes us. Technology then is not good or bad, nor is it neutral. Technology, specifically AI, is shaping how we view God, ourselves, and the world around us in profound and distinct ways.

While we rightly debate how to mitigate the risks and promote the good of technological advances, the Church must not give in to the moral panic induced by AI, nor should we passively allow others to shape the conversation in ways that are directly at odds with the Christian tradition. As Carl F. H. Henry wisely noted, the center of the Christian ethic is the concept of love, which is modeled in the Great Commandment given to us by Christ (Matt. 22:37-39). The Church must see the love of God and love of neighbor, manifested in recognizing the dignity of all, as central to the ongoing work related to AI and its role in our society.

The 2019 statement of principles was designed to jumpstart the conversation about AI in the Church, which is needed now more than ever. As the Church engages these questions, we must remember that the Christian moral tradition recognizes that no matter how advanced our technologies become, there is nothing that can fundamentally change what it means to be made in the image of the almighty God (Gen. 1:26-28). Embracing this truth today means retrieving a robust view of what it means to be human in an age of machines.

By / Feb 13

Since 2020, I have sought to write about some of the top technology issues to be aware of and how we as Christians can address them in light of the Christian moral tradition rooted in the love of God and love of neighbor. There have been a couple of prevailing themes over the years centered on the ways that technology is shaping us as people—namely our understanding of ourselves and those around us—and how we as a society are to think through the power it holds in our lives.

Already, it seems 2023 is going to be an interesting year as we deal with an onslaught of emerging technologies like advanced AI systems and virtual reality, as well as continue to navigate pressing challenges of digital privacy and the role of faith in the digital public square.

Artificial Intelligence and ChatGPT

Back in 2020, there was already social buzz about AI and how it was shaping our society. I published my first book, The Age of AI: Artificial Intelligence and the Future of Humanity, with the goal of thinking about how some of these technologies might affect our understanding of human dignity and our common life together. As 2022 came to a close, OpenAI released ChatGPT, a chatbot that caught the attention of our wider culture and confirmed that AI is (and will continue to be) a major part of our lives, especially in education and business.

The introduction of advanced AI systems like these in recent years has fundamentally challenged much of what we have assumed about the uniqueness of humanity. These systems are now performing tasks that only humans could in past generations.

In an age like ours, we all need to be reminded that the value and dignity of humans isn’t rooted in what we do but who we are as those uniquely made in the image of our Creator.

AI systems like ChatGPT have deeply concerning elements but also afford the opportunity for educators and students to evaluate with fresh eyes the purpose and design of education. Education is not simply about information transfer but whole-person transformation. These types of tools require that administrators, professors, and students alike learn how these systems work, their advantages and limitations, and how to prioritize whole-person transformation above a simple letter grade.

Similar to the classroom, these tools may have limited use in local church ministry but must be thought through with the utmost care and wisdom. They may be used to aid one in research, writing reflection questions, or even rudimentary copy for church functions. However, one must keep in mind their limitations as well as be on guard for the temptation to simply pass off the output as their own work. 

Current limitations with these systems are myriad and must be taken into account as one thinks through the ethical ramifications of their use. They are limited by the data sets and human supervision used in training the system; are widely known to falsify information, misapply concepts, or even alter their answers based on the political and social views of their creators; and rarely account for nuance and complexity, leading to, at best, the production of entry-level, basic material.

Privacy rights and children

A second issue we should be aware of is one that will inevitably be perennial. With the ubiquity of technology and our growing dependence on it, there is vast and growing concern over personal privacy and how our data will be used by individuals, governments, and especially the technology industry.

We live in a data-saturated world, and there can be a lot of money made by harvesting troves of data and creating predictive products or optimizing our interactions with our daily technology use. Governments around the world are beginning to or have already regulated the flow of data and who has access to it, often focusing on a supposed right to privacy—a term that has competing definitions and proposed safeguards.

Christians, specifically, need to think deeply about what a right to privacy is and what it is not.

In 2023, four American states—Colorado, Connecticut, Utah, and Virginia—will follow the pattern set by California’s groundbreaking privacy law, the California Consumer Privacy Act (CCPA), which went into effect in January 2020, and begin implementing comprehensive new regulations on data collection and use. These new state laws share many of the same types of protections as the CCPA and the European Union’s General Data Protection Regulation (GDPR).

This year, there will be increasing pressure across the board for federal legislation focused on privacy as it specifically relates to children, as is seen with the bipartisan Kids Online Safety Act (KOSA) and more broadly with other proposals. 

Regardless of where these policies end up, framing privacy solely in terms of moral autonomy and personal consent makes it easy to overlook why data privacy is such a central concern of Christian ethics.

Instead, Christians need to be the ones asking the hard questions about how we as a society want to protect and guard the rights of the individual but in ways that also promote the common good.

Virtual reality and augmented reality

One of the technologies that was discussed significantly in 2022 and will likely continue to be in 2023 is virtual reality (VR) and augmented reality (AR). In recent years, we have seen a surge of new VR devices and an increasing number of wearable devices such as smart glasses. As the devices become more commonplace in our society and societal norms continue to shift, it seems likely that they will grow in prominence in our lives.

Some of the pressing ethical questions about their use are not as straightforward as other ethical issues in technology, and a host of new challenges will arise, especially in light of the new mediums and means of connection that VR has created.

Aside from the more common concerns of data privacy, including the use of advanced biometric data such as eye-tracking and more, there are also novel challenges to long-standing understandings of free speech and religious freedom in these digital spaces. These developments are often spoken of in terms of the wisdom of VR churches and gatherings. However, I think the more pressing questions will be over how religious groups who may hold to culturally controversial beliefs—especially on topics like sexuality and gender—will be treated in these digital environments.

These spaces are not truly public because they are often hosted or even created by technology companies themselves. This represents a new angle on the continued debate over free speech, content moderation, and the nature of faith in the public square.

Overall, 2023 will be a year where Christians are continually pressed to think about how we will live out our faith in the public square amid an increasingly secular culture. One of the temptations when faced with complex or challenging ethical questions with technology is to rush to a position of full adoption or rejection. 

Wisdom, which is at the core of the Christian moral tradition, calls us to slow down and think deeply about the nature of these tools, and discern if their many uses can help us better love God and love our neighbors as ourselves.

By / Aug 17

Since 2017, the Chinese Communist Party (CCP) has subjected Uyghur Muslims, a predominantly Turkic-speaking ethnic group, to a systematic campaign of oppression and persecution. The geographic scope of the CCP’s campaign against Uyghurs is global, but it is concentrated in Xinjiang, China’s western-most territory, where Uyghurs have lived for centuries. Under the guise of national security, the CCP is seeking to “pacify” the region with totalitarian tactics like pervasive surveillance, thought control, ideological reeducation, forced birth control, and compulsory labor. Life for many Uyghurs is a living nightmare.

Surveillance state of the Chinese Communist Party

For Uyghurs living in Xinjiang, there is no such thing as a private life. The Chinese government has built a pervasive surveillance apparatus that not only records the movements of Uyghurs, but also tracks normal, routine actions. Something as innocent as entering one’s house through the back door, not socializing with neighbors, using WhatsApp, or changing phone numbers could trigger suspicion from China’s highly developed artificial intelligence algorithms.

These algorithms intrude into the most sensitive and personal facets of the lives of Uyghurs, tracking their phones, cars, reproductive choices, and political views. The CCP often justifies its detention of Uyghurs on the grounds that they are engaged in extremist or terrorist activity, but the scope of China’s high-tech surveillance far outstrips the problem, resulting in arbitrary intimidation and arrests.

Reeducation camps for Uyghur people

The surveillance networks throughout Xinjiang flag “suspicious” Uyghurs for CCP authorities. Once Chinese police detain a Uyghur for questioning, they are often sent away for “political reeducation.” China has constructed upward of 1,000 internment camps for this purpose. Estimates vary, but experts posit that China has detained between 1 million and 3 million Muslims in these facilities. Aside from political indoctrination, physical and psychological abuse is commonplace throughout these camps, ranging from rape and torture to malnourishment and forced organ harvesting.

The CCP also uses these camps to break apart Uyghur families. In cases where Uyghur husbands are sent off to camps, China has sent ethnically Han men to forcibly procreate with the wives who are left behind. In some cases, where both the mother and father are detained, the CCP has sent Uyghur children to government-run boarding schools where all communication with the outside world is strictly regulated.

Forced labor by the Chinese Communist Party

The CCP’s oppression of Uyghur Muslims does not stop at reeducation. In 2018, reports began to emerge chronicling how China exploits this group’s labor. China is the world’s largest cotton producer, and the vast majority of those exports come from Xinjiang. For many Uyghurs, the reeducation camps are a launching pad to compulsory labor in this industry. Whether in Xinjiang or throughout China, the CCP is relocating Uyghurs and exploiting them for free or underpaid labor.

Because of China’s significant cotton exports, companies that operate in Xinjiang or purchase cotton or clothing from China run the risk of financially supporting the oppression of the Uyghur people. A March 2020 report entitled “Uyghurs for Sale” looks at the supply chains of over 80 international brands in the technology, clothing and automotive sectors and documents how Uyghur workers have been compelled to work in factories that are connected to the supply chains of those brands.

Forced sterilization of Uyghur women

China has a long history of imposing restrictive family planning on its citizens, and for years strictly enforced the infamous “one-child policy.” That restrictive birth policy has created a stark gender imbalance, leaving many Chinese men today unable to find wives and fueling the trafficking of brides and a larger sex-trafficking industry. At the end of 2015, the Chinese government loosened its policies, allowing couples to legally conceive two children, and has encouraged Han Chinese to do so.

But while China has relaxed its family planning policy toward Han Chinese, the CCP has severely oppressed Uyghur women with draconian birth control measures. Uyghur women are subjected to forced pregnancy checks, medication that stops their menstrual period, forced abortions, and surgical sterilizations. 

One of the major reasons that Uyghur women are sent to the internment camps is for having too many children. China’s goal, it seems, is to eradicate future generations of Uyghurs by manipulating who can and can’t bear babies, and how many children a family can legally conceive.

How has the U.S. government responded?

In July 2020, Secretary of State Mike Pompeo and Treasury Secretary Steve Mnuchin announced that the United States government would apply “Global Magnitsky Sanctions” to top-ranking Chinese officials and a Chinese government entity for their roles in human rights abuses and religious freedom violations against the Uyghurs in Xinjiang. The Global Magnitsky Human Rights Accountability Act, passed by Congress in 2016, authorizes the executive branch to impose visa bans and other restrictions on any foreign person or entity “responsible for extrajudicial killings, torture, or other gross violations of internationally recognized human rights committed against individuals in any foreign country seeking to expose illegal activity carried out by government officials, or to obtain, exercise, or promote human rights and freedoms.” 

In addition to administrative action, Congress has passed several important pieces of legislation to counter China morally. Recently, the Uyghur Human Rights Policy Act was signed into law. The legislation imposes sanctions on foreign individuals and entities responsible for human rights abuses in China’s Xinjiang Uyghur Autonomous region and requires various reports on the topic.

Congress has likewise introduced the bipartisan, bicameral Uyghur Forced Labor Prevention Act. This important bill would prohibit goods made with forced labor in Xinjiang, or by entities using Uyghur labor forcibly transferred from Xinjiang, from entering the U.S. market. This legislation also instructs the U.S. government to impose sanctions against any foreign person who knowingly employs or utilizes the forced labor of Uyghurs and other Muslim minority groups in Xinjiang.

What can you do to help?

Speak up

Each one of us can use our voice to speak up on behalf of those who can’t speak up for themselves. You can share articles on the persecution of Uyghurs on social media. You can invite a Uyghur to share their story with your community through Zoom. You can urge the U.S. government to continue taking strong measures to address these injustices. Below are some resources to educate yourself and share with others.

Pray 

We ought to pray often for persecuted people around the world. Below are a few specific ways to pray.

  • Pray for the leaders of China, that they will end their oppression and persecution of their citizens, especially Christians, Uyghurs, and other ethnic and religious minorities. 
  • Pray for Christians in China, that they will be bold in proclaiming the good news of the gospel, and that they will stand up for those who are being persecuted.
  • Pray for world leaders, that they will have the courage and wisdom to counter China morally and hold the CCP accountable for their gross violations of human rights.

Christians should be on the frontlines of advocating for the dignity and human rights of all people. We cannot remain silent or complacent in the face of such injustices.

The Chinese Communist Party (CCP) routinely violates the basic human rights of the Chinese people. Their decades of abuse are well documented, including systematically monitoring and destroying Christian churches.

By / Aug 3

It’s easy for anyone to get caught up in the hype surrounding new technologies. A new innovation often debuts with some helpful benefits and great new features, all of which wow us and lead us to believe that we are on the cusp of something truly revolutionary. Promises are made, and there are countless predictions about what is to come next. But soon after the press conferences fade and the hype dies down, we see these innovations for what they really are—helpful tools with innovative benefits that often do not live up to the hype surrounding their release but also reveal a number of potential misuses, abuses, or failures that we did not account for. Part of this is because we grow accustomed to innovation. But it also happens because we put a level of hope and desire on these technologies to usher in a new era of our world.

OpenAI recently announced its new language model, GPT-3, one of the most advanced AI systems in the world to date. The system is truly amazing. It is able to write prose, design and code basic HTML including various mini applications, and even engage in “deep” philosophical conversations about the nature of God and the universe. OpenAI released the technical documentation back in May, and according to Morning Brew, GPT-3 “has 175 billion parameters, a 117x increase over its predecessor’s 1.5 billion.” The system was trained on roughly a trillion words. In layman’s terms, it is pretty powerful. The company allowed a small group of select users to test the system, and many shared their experiments online to show off its power.

Almost as soon as people began to see the immense potential of the system, a wave of excitement arose about what this step forward in AI might mean. “Playing with GPT-3 feels like seeing the future,” a San Francisco-based developer tweeted about the tool. Some even questioned whether we were that much closer to human-level AI, also known as artificial general intelligence (AGI). However, the dream of a future AGI system is highly debated in computer science circles as well as in philosophy and religion.

Human level intelligence?

This isn’t the first time we have had such utopian dreams with AI. Some of the talk surrounding GPT-3 reminds me of the debut of Google Duplex back in 2018 at their annual developer conference, where Duplex was shown to book a haircut at your local salon or even a table at a restaurant all on its own. There have been countless seasons of grand visions for AI and where we are headed as a society, which ultimately died down over time as we adjusted to our expectations and saw these innovations as encouraging advances but ultimately not as life-altering as promised.

The reality is that GPT-3 is extremely powerful and, judging by the users who have been working with it, honestly a good bit of fun, but this system is no closer to ushering in the famed golden age of AI than any other innovation. This is simply because our current level of AI, known as narrow AI, while powerful and beneficial, is nowhere close to actually understanding the results or products it delivers, nor will algorithmic technology ever achieve general or human-like intelligence. This is because human beings are not machines, even if we often treat each other as mere objects to be manipulated and altered at will. Our minds and consciousness are not simply the result of some chemical reaction or organic algorithm, a view popularized by thinkers such as Yuval Noah Harari and Ray Kurzweil.

The depth of the human experience

This point has been highlighted by many prominent thought leaders over the years, such as renowned Oxford mathematics professor John Lennox, the late philosopher Roger Scruton, and computer scientists like Rosalind Picard and Joanna Ng. This summer, I spent some time digging into Scruton’s works On Human Nature and The Soul of the World, in which he shows how a naturalistic understanding of the world fails to account for the depth of humanity in terms of our conscious experiences, emotions, moral agency, and even how we see each other as unique beings in this world. This reductionistic view of humanity often lies behind the pursuit of the famed rise of AGI, because if there is nothing unique about humanity, then we should be able to recreate human intelligence and experience in digital form.

Scruton describes the presence of subjective experience, or the I/You paradigm, as one of the main differences between how subjects (like you and I) operate in a world of objects (like that of technology). This is one of the reasons we question the nature of ethics, our identity, and even the presence of God himself. Even the animal kingdom doesn’t experience the world as humanity does. We were created unique by God himself as his image bearers (Gen. 1:26-28).

There is a common misconception that our personhood can be derived simply from the material, which leads humanity down dangerous paths of believing that we are less valuable than we really are and overvaluing technology as if it somehow has the potential to become our equal or even surpass us in terms of utility or dignity.

Many may have missed how quickly people acknowledged that the GPT-3 model did not exhibit any of the signs of actual human-level intelligence, even if the system could do things previously unbelievable for an AI system. But this longing to create an AGI system reveals something a bit ironic about our desires as humans that we shouldn’t miss. We often seek to humanize our creations, i.e., technologies, all the while dehumanizing ourselves. In our desire to be like God and create something in our image, we end up having to dumb ourselves down and treat ourselves as if we are merely machines rather than uniquely created image-bearers of the living God.

While many will continue to claim that faith and science are simply at odds with one another and that AGI development is just around the corner, Christians can remember and have hope that even with our wildest attempts or innovations, we are simply not able to change our own human nature nor create something like ourselves. As amazing new technological innovations continue to rise, we can step back and praise God for the incredible, talented people creating these tools rather than focus on some desire to create something on par or even greater than ourselves.

These innovations can be used for immense good, but we also must remember that they will be misused and possibly even become objects that we put our hopes in instead of God himself. We may trick ourselves into believing that it is possible to reach AGI or even create an AI system that can pass the famed Turing test, but we simply are not able to define, alter, or manipulate our humanity and personhood to feed these longings. We are God’s creatures and must never forget how we are called to live in this world—always recognizing our creatureliness and fixing our gaze on the Creator of all life and everything in the cosmos.

By / Jul 9

Jason Thacker: As churches haven’t been able to gather in many months and as some begin to gather again under various restrictions, what kinds of things do you think we miss about the gathered church that technology cannot replicate or replace?

Jay Kim: Embodied presence. Almost everyone I talk to expresses the same sadness and longing—that all of the digital online mediums at our disposal are helpful but ultimately unsatisfactory. Several months into sheltering-in-place now, as digital fatigue sets in, I think what we miss most is the ability to be near one another as we worship and commune—hearing voices sing together, listening, learning, leaning in together as we hear the Word preached, the shuffling of feet and the extending of hands as we take the bread and the cup together. We miss the conversation in the lobby or courtyard, before and after, all the stuff of human experience that digital connections try but fail to replicate. Technology is doing a fine job keeping us pseudo-connected in this time, but its shortcomings are also becoming abundantly clear.

Julie Masson: I miss the atmosphere of being in a room with voices worshipping together. You can’t replicate the sound of the person behind you singing slightly off-key or the visual of the girl in front who is raising her hands and swaying. All five senses seem to be irreplaceable in a virtual setting.

John Dyer: The first things that come to mind are all the little accidental things that happen with physical proximity—reading the face of someone you haven’t seen in a while and knowing you need to go up to them, feeling the room react to a point (or joke) in a sermon, hearing someone else’s baby who’s not on mute. At the same time, I think it’s important to acknowledge that there are already elements of our in-person gathering that technology has replaced, but not replicated. An example of this is online giving, which is so helpful for churches in the summer months, but which also hides the spiritual practice of bringing money every week and the communal practice of seeing our brothers and sisters give together.

 I’m not that concerned that we use our technology too much, I am concerned that we use it with too little reflection on how its form shapes our message. – John Dyer

JT: What are some of the best practices you have seen in regard to technology and the church in this season?

JK: It feels a little archaic even saying it, but using phones as a listening and talking technology has proven to be a beneficial practice during this season, at least in our context and community. Turning away from the lure of social and news media, even texting, and picking up the phone to call someone has become a way of focusing our energy on little things that go a long way. I’ve tried to call people in our church community several times a week throughout this time of sheltering-in-place.

Before the coronavirus, it was mostly emails and texts from me. But now, exhausted by the digital disconnect, being able to focus solely on a voice without the added element of video and text has become a respite. And for some in our community, phone calls are so rare these days that receiving one is almost akin to receiving a hand-written letter in the mailbox; there’s been something surprisingly pleasant about it. Aside from the phone calls, the chat feature during online gatherings has been a helpful tool in creating at least some form of interaction as we “gather” in online spaces.

JD: Churches that had previously built their Sunday gathering times around highly commodifiable elements—three fast songs at pitches only professionals can reach, four minutes of video announcements, two slow songs, a sermon, etc.—were probably most prepared to enable those to be consumed online. What is rarer are churches that have intentional times of silence and prayer, songs that people and families can sing, and interactive elements that bring people out of the “watching church” mode.

JT: In what ways can churches bear the burdens of those who are still unable to attend in-person gatherings for a while due to this virus?

JK: One of the most encouraging and inspiring things I’ve seen come from this season has been the way so many have given their time, energy, and resources to come alongside the most vulnerable and needy in our midst. From picking up and dropping off groceries to gardening to delivering meals, I’ve seen people bearing one another’s burdens in very visceral, real-time, real-life ways; an analog leaning, if you will. In some ways, this is one of the simplest and most powerful ways for us to truly be the church.

On an ecclesiological level, one of the most encouraging things I’ve experienced is how this pandemic has unified church leaders. Every Tuesday I’m on a Zoom call with dozens of others serving and leading local churches in Silicon Valley and the greater Bay Area. We pray for one another, share best practices, express specific needs, etc. Much has come of this: specific, pragmatic help from one church to another, constant prayer for each other, and a unified plan for reopening, even though the rollout of that plan will look different from church to church.

I’m hopeful that the church can and will continue to leverage technology. But as we do, we must never forget that the information must always point toward an invitation into embodied realities. – Jay Kim

JM: Overcommunication is key. Our church leaders have done a great job of sending weekly email updates to members, and they keep emphasizing how people can connect with the pastors, what is and isn’t happening in the church, and encouraging people to reach out to their small group members. This same information is repeated in different formats on social media. Overcommunication will help people feel like they know who to reach out to for help and how to be connected to the church while remaining at home. 

JD: I see congregations doing all kinds of wonderful work through activities like grocery shopping for those who can’t go out, sharing favorite local restaurants, supporting healthcare workers at nearby hospitals, and holding outdoor gatherings. On a more personal level, I’ve found that returning to phone calls has been particularly meaningful. One incredible tool is SoundOfYourLove.com, which allows friends and family of those quarantined in the hospital to record a soundtrack of messages to give patients hope and connection.

JT: Help us understand some of the dangers of technology in the church and what we might do to avoid abusing these tools or relying upon them too much.

JK: Digital technology often values speed, choice, and individualism. Everything is always getting faster (speed), the options are vast and endless (choice), and our entire experience is customized to our personal preferences and personalities (individualism). When we’re not careful, these values can turn in on themselves and become not only counterproductive but also quite dangerous. Speed can make us impatient, choice can make us shallow, and individualism can make us isolated. 

When we find ourselves relying on these tools too much, and our reliance goes unchecked for too long, these values inevitably form us into an increasingly impatient, shallow, isolated people—and the danger here for followers of Jesus is that discipleship is actually a patient, deep, communal work. Awareness of the subtle, subversive, and dangerous ways our use of these technologies is forming us is step one. Implementing defined limits and parameters for use is step two.

JD: I think we need to relentlessly challenge a way of thinking that’s deeply wired into the circuitry of evangelical thinking on technology: “the methods may change, but the message stays the same.” On the surface, this seems right because the gospel seed can grow in the soil of any culture. But this way of thinking also seems to say that form doesn’t matter, that our faith is simply content that can be delivered in any medium, and that beauty, truth, and goodness are separate things. So I’m not that concerned that we use our technology too much, I am concerned that we use it with too little reflection on how its form shapes our message. Instead of using whatever shiny thing we see on Twitter, we have to think intentionally about using form and content together to shape our bodies, souls, messages, and communities.

JT: How might God use technology to further the mission of the church in the coming years?

JM: It reminds me of how people used to believe that ebooks would end up destroying the print book. They were ultimately wrong; ebooks, while convenient, have primarily served to increase the desire for physical books. I agree with the others that there is a similar parallel here: online church services are driving a greater desire for in-person church gatherings. I hope that more churches will keep an eye toward accessibility, and perhaps those who were not streaming their services will start, so that shut-ins or those who are sick can still partake in part of the service, even if virtually.

JD: In the opening chapters of Genesis, God says that our creativity is part of our image-bearing and part of our call to have dominion over, cultivate, and care for his creation. In the center of the story is Jesus, who is a second Adam, both by perfectly following the Law and by being a tektōn, a carpenter, who cultivated the Garden and the Temple, and who died on a hideous machine made from the very tools with which he worked. And at the end of the story, we see a resurrected Jesus bringing down from heaven a new city, a holy city, full of all of the things humans make—swords beaten into plowshares, roads paved with gold, trumpets filled with music, and gates in all directions. I think this means that technology and human creativity are not just a means for telling the story, but they are part of the story. I enjoy my work building things like Bible software for closed countries, online education platforms for seminaries, and other tools.

JK: A helpful line differentiating digital from analog realities has been the divide between information and transformation. Digital technologies offer us incredible opportunities to inform people. And information is undoubtedly an important element of sharing the gospel. But ultimately, the mission of the church does not stop at information but presses on to transformation—to be remade day by day into the image of the risen Christ. The work of transformation, I believe, is always an embodied, incarnational work. It’s communal too, in the sense that we cannot do it alone.

We are not saved as individuals headed for a far off place called heaven. We are saved into a family, called to embody heaven’s reality, the rule and reign of Christ as King, here and now, as we look forward together toward the day when Christ shall return and right every wrong. I’m hopeful that the church can and will continue to leverage technology to inform the world of kingdom possibilities in compelling ways. But as we do, we must never forget that the information must always point toward an invitation into embodied realities, where we gather together as the people of God to be transformed in real ways, in real time, and in real space.

By / Mar 3

Jason Thacker, of the Ethics & Religious Liberty Commission of the Southern Baptist Convention, released his new book today, “The Age of AI: Artificial Intelligence and the Future of Humanity,” with Zondervan.

In “The Age of AI,” Thacker, who serves as creative director and associate research fellow at the ERLC, helps readers navigate the digital age by providing a thoughtful exploration of the social, moral and ethical challenges of ongoing interactions with artificial intelligence.

“I don’t fear AI,” said Thacker. “Rather, I fear the people of God buying the lie that we are nothing more than machines and that, somehow, AI will usher in a utopian age. AI is not a savior. It is not going to fix all of our world’s problems. It is a tool that must be wielded with wisdom.” 

“The Age of AI” serves as a guide for those wary of technology’s impact on society and also for those who are enthusiastic about the direction of AI in the culture. In the book, Thacker explains how AI affects individuals–in relationships and in society at large–as he addresses AI’s impact on people’s bodies, sexuality, work, warfare, economics, medicine, parenting and privacy.

With theological depth and an expert awareness of the current trends in AI, Thacker is a steady guide, reminding readers that while AI is changing most things, it does not need to compromise our human dignity.

Richard Mouw, president emeritus and professor of faith and public life at Fuller Theological Seminary, provided the foreword for the book. 

Russell Moore, president of the ERLC, comments on Thacker’s new work.

“No ethical issue keeps me up at night as does the question of artificial intelligence. The reason for my dismay is that the church doesn't seem to be thinking very deeply about these matters at all, even as we move into a technological revolution that could prove to be Gutenberg-level in its implications. This book is a balm for anxiety in the age of technological disruption. No evangelical has thought and written more clearly on these matters than Jason Thacker. In this monumental work, he avoids both naïveté and paranoia about AI. The years ahead will require wise Christians in a time of smart robots. This book shows the way.”

“The Age of AI” has been endorsed by several public figures including former Florida Gov. Jeb Bush; Republican Sen. Ben Sasse of Nebraska; technologist and VMware CEO Pat Gelsinger; and theologian R. Albert Mohler, Jr., president of The Southern Baptist Theological Seminary.

Gov. Bush said, “Harnessing technology in our world, especially in education and medicine, can help us live productive and fulfilling lives. Yet Thacker reminds us that we must learn to do so in ways that glorify God and protect the most innocent among us. Great read for parents!”

ABOUT THE AUTHOR: Jason Thacker serves as associate research fellow and creative director at the Ethics and Religious Liberty Commission. He is a graduate of the University of Tennessee in Knoxville, Tenn., and holds a master of divinity from the Southern Baptist Theological Seminary in Louisville, Ky. His work has been featured at Christianity Today, The Gospel Coalition, Slate, and Politico. Jason and his wife, Dorie, live outside of Nashville with their two sons. 

To request an interview with Jason Thacker contact Elizabeth Bristow by email at [email protected] or call 202-547-0209. 

By / Feb 24

In the fall of 2018, Elon Musk made headlines once again. This time it wasn’t about his commercial rocket company, SpaceX, or his popular electric car company, Tesla. During an interview with Axios, a popular news service, Musk, referencing Darwin’s theory of evolution, declared that humanity must merge with AI or risk becoming to future AI what monkeys now are to humans: a species surpassed in complexity and might.

Musk’s plan for humanity includes adding a chip into our heads to upgrade our mental capacities, allowing us to keep up with the intelligence of future AIs as well as stopping bad actors on the world stage from hoarding all of the world’s information. But Musk is just one of the latest popular figures to propose a theory that has been around for generations: transhumanism.

Transhumanism is the term for humanity’s effort to upgrade its own abilities, both physical and mental. Julian Huxley, known as the father of transhumanism and brother of famed writer Aldous Huxley, describes the concept in “Transhumanism,” his popular 1957 essay: “The human species can, if it wishes, transcend itself—not just sporadically, an individual here in one way, an individual there in another way, but in its entirety, as humanity.”

Huxley’s prediction that humans will upgrade themselves in fundamental ways may already be more of a reality than you’d think. While researching my book, The Age of AI, I ran across some of the most interesting and mind-boggling uses of AI in the medical field that I had ever seen.

AI is now being used in prosthetic limbs to help amputees or those born with disabilities live normal lives. From mind-controlled units to limbs that use advanced AI to become aware of the environment they are being used in, prosthetics have become extremely advanced in the last decade. Samantha Payne of Open Bionics, a UK-based robotics firm, says that her company has “had people say they’re tempted to replace healthy limbs with bionic ones.”

This desire to upgrade our bodies, even when the upgrades aren’t medically needed, is going to be more of a temptation in our society with each advance of AI and robotics. Deep down each of us knows that our bodies and minds are not ultimate. There is something lacking in us. This realization leads us to try to create something better than ourselves. But with the rise of AI, we now believe that we can make ourselves better by becoming partly machines.

This desire is nothing new; it has been part of science fiction for years. George Lucas, for example, popularized it in his Star Wars movie series, in which Luke Skywalker is given a robotic arm after losing his flesh-and-blood arm in his battle with Darth Vader, who himself is “more machine now than man,” according to Obi-Wan Kenobi. And today a robotic arm like Luke’s is a reality: the Life Under Kinetic Evolution (LUKE) arm has become the first muscle-controlled prosthetic to be cleared by the United States FDA.

But it should be noted that many of these AI-enhanced medical techniques are prohibitively expensive for most people. Such innovation is often expensive because it requires significant time and resources to develop. The LUKE arm can cost upward of $150,000, not including the cost of rehabilitation and medical care, and innovative medical treatments are often not covered by insurance. But the hope is that as the technology matures and becomes cheaper, costs will decrease, making restorative uses of AI available to more people.

For all of the potential benefits brought by advances in artificial intelligence, there are also some great dangers that we must be aware of in this age of AI. The transhumanist line of thinking will quickly lead to humans being treated like pieces of flesh to be manipulated in search of some upgrade to become greater than ourselves. In this pursuit, it will be easy to regard as less than human those who have no clear societal value. If we successfully upgrade ourselves, a new disparity between the haves and have-nots will appear.

An unfettered hope in our ability to fix the world’s problems through technology will end only in heartbreak and broken bodies. We were not designed to carry that weight or responsibility. We are not gods, but we were made like the one who created everything. We are not able to fundamentally upgrade ourselves because we already are God’s crowning achievement in creation (Eph. 2:10). If we belong to God, there is nothing lacking in us.

Christians should be the first to say to our culture that every life has value and that all human beings deserve our love and care. We should pursue advances with a mindset and ethic that is not just human focused but grounded in something greater than ourselves: the imago Dei.

While we should pursue technological innovation to help push back the effects of the fall on our bodies, we should not seek to keep up with the machines, because they are never going to rival us in dignity and worth in the eyes of God. Our machines will increasingly have abilities that surpass ours, but they never will achieve dignity on a par with ours. God proclaims that we are not the sum of our parts, nor are we just bodies that should be upgraded at will.

Though the use of AI in medicine can be a slippery slope, we will continue to pursue it because of its benefits. 

The questions before us are, What moral guidelines should we give these systems? And how should they be used in society?

We must have clear minds and convictions as we develop and use technology in medicine. We must remember that these tools are gifts from God and that they can and should be used to save lives. Because every human life, from the smallest embryo to the woman with dementia in her old age, is made in the image of God, each person is infinitely worthy and deserving of our love, care, and respect.

We should pursue AI medical technology as a reminder of God’s good gifts to help us engage and love a world that has been ravaged by sin and destruction. With artificial intelligence, we will have new abilities to save human life. But we must not misuse these tools to favor one group of people over another or fool ourselves into thinking we can transcend our natural limitations. These are no more than feeble attempts to play God.


Taken from The Age of AI by Jason Thacker. Copyright © 2020 by Jason Thacker. Used by permission of Zondervan. www.zondervan.com.