Christian Pastoral Care and Companionable AI
Dr Ximian Xu
The rapid progress of Artificial Intelligence (AI) has made AI-driven artefacts pervasive in human lives and even part of human communities. Against this backdrop, the term ‘artificial companion’ is coined to mean all computational artefacts that are designed to ‘get to know their owners … and focusing not only on assistance via the internet (contacts, travel, doctors etc.) that many still find hard to use, but also on providing company and companionship, by offering aspects of personalization.’1 Given AI’s dominant role in technologies nowadays, an artificial companion typically refers to an AI-assisted artefact used for companionship, be it chatbots, AI robots, carebots, or other AI-driven systems.
We should acknowledge that AI and other digital technologies benefit human caregiving practices (e.g., robot care for the elderly and healthcare robots). What remains controversial is the way artificial companions are applied to human lives. This raises further questions about the application of artificial companions to religious pastoral care. Artificial companion technology has already been received within religious communities. A typical example is the Church of England’s Alexa Skill, created in collaboration with Amazon, which can say prayers, explain the Christian faith, support mental health, and so on. The Skill is widely used within the Church. Revd Katherine Hedderly, Vicar of All Hallows by the Tower, said: ‘I can see how the Skill enables members of the congregation to reflect more deeply at key moments like Lent and Christmas.’2
AI-driven pastoral care puts religious ministers on edge and perplexes some religious communities, for pastoral care is one of the pillars of religious communities. One of the major responsibilities of religious ministers is to provide care when believers are undergoing mental and/or physical suffering and trouble. The challenge AI technology poses to religious ministers and communities alike is whether, and to what extent, religious pastoral care can be performed by pastoral carebots. Can AI-driven systems be deployed within religious communities for pastoral care on all levels? If so, what is the role of human ministers in pastoral care in the age of AI? I will use Christian pastoral care as a case study to respond to these questions.
David Lyall, former Principal of New College and Senior Lecturer in Christian Ethics and Practical Theology at the University of Edinburgh, helpfully describes some important features of Christian pastoral care, which enable us to evaluate the application of artificial companions to religious communities:
Pastoral care involves the establishment of a relationship or relationships whose purpose may encompass support in a time of trouble and personal and/or spiritual growth through deeper understanding of oneself, others and/or God. Pastoral care will have at its heart the affirmation of meaning and worth of persons and will endeavour to strengthen their ability to respond creatively to whatever life brings.3
I would like to highlight two aspects of Lyall’s account of pastoral care: (1) relationship, and (2) the meaning and worth of persons. Lyall places these two aspects at the centre of Christian pastoral care, and so inquiry into them paves the way for a critical examination of AI-driven pastoral care.
First, pastoral care is inherently relational: both ministers and care-receivers benefit from the practices of pastoral care through establishing or strengthening human-human and/or human-God relationships. Care-receivers benefit because the care ministers provide assists them in developing their relationships with others and with God. At the same time, ministers grow spiritually through offering pastoral care, deepening their understanding of their relationship with care-receivers and with God. The relational character of pastoral care presupposes otherness, both divine and human. It is during pastoral care that the otherness of God is revealed through the otherness of humans.
In pastoral care, such double otherness should not fade away. However, this is not the case with artificial companionship or AI-driven pastoral care. Otherness often vanishes in AI-driven pastoral care because artificial companions are designed to meet the personal needs of human users, all the more so when AI is heavily commercialised. A typical example is Replika, an AI-powered app designed to be a human companion who sees the world through the user’s eyes. When starting to use the app, users answer questions about themselves to create their own chatbots. Subsequently, the more users talk to their companions, the smarter the companions become. Digital companions in Replika are treated as friends, partners, mentors, and even AI copies of users (i.e., digital twins). Needless to say, a digital companion named ‘AI minister’ could be created in AI-driven systems like Replika. For all that Replika and other AI-powered artificial companions are prone to be humanised, it is an exaggeration to say that AI ministers are independent entities operating to provide pastoral care. In fact, there is an illusion here: the relationship between human users and AI ministers is, by its nature, a relationship between the self and an artificial self. In this sense, AI-driven pastoral care becomes self-pastoral care, which rules out both divine and human otherness.
The second feature of pastoral care is an emphasis placed on the meaning and worth of human persons. In pastoral care, people often give much attention to the soul, the mind, and the psychic aspects of human beings, because caregiving practices should nurture relationships and lead to spiritual growth. However, the meaning and worth of being human must involve the whole human being, including both the body and the soul. As such, we should reckon with the role of the human body in pastoral care.
By way of illustration, the human brain, as part of the body, is important for pastoral care. Neuroscientific studies have shown that the brain matters to empathy. For example, empathy is indexed to neurochemicals, that is, organic molecules involved in neural activity: empathy and pro-social behaviour are associated with increases in the neuropeptide oxytocin. Recent studies using functional neuroimaging have also revealed the neural mechanisms of empathy. Neuroimaging research on pain and empathy shows that the same brain regions are activated in the empathiser and the recipient of empathy when the former empathises with the latter; the empathiser thus feels an empathic or vicarious pain. Here the mirror neuron mechanism comes into play in light of a perception–action model. Mirror neurons are a subset of neurons involved in understanding and imitating the actions of others. The perception–action model, proposed by Stephanie Preston and Frans de Waal, holds that the ‘attended perception of the object’s state automatically activates the subject’s representations of the state, situation, and object, and that activation of these representations automatically primes or generates the associated autonomic and somatic responses, unless inhibited.’4 Through the mirror neuron mechanism, the empathiser perceives the other’s state and consequently feels a similar emotion.
Because pastoral caregivers are biologically indebted to the mirror neuron mechanism in the brain, they can understand and empathise with care-receivers. Understanding the troubles and suffering of care-receivers and empathising with them is a precondition for pastoral care. More important still, the role of biological bodies in pastoral care distinguishes human ministers from artificial companions and enriches the meaning of pastoral caregiving practices. For example, Paul writes in Romans 12:15, ‘Rejoice with those who rejoice, weep with those who weep’ (NRSV). Both ‘rejoicing with’ and ‘weeping with’ can, to some extent, be performed with AI-driven technologies such as virtual reality. But the meaning of the preposition ‘with’ becomes more cogent when pastoral caregivers are physically present with care-receivers. In any case, the presence of the caregiver’s biological body cannot be replaced by a silicon-based artefact.
That said, I do not mean that pastoral care should be severed from AI. As noted at the outset, AI-driven systems benefit caregiving practices. Hence, scholars like William Young suggest that religious communities should be ready to deal with questions about the reception of ‘automating relationships in ministry.’5 Having laid out the radical difference between AI-driven companionship and human caregivers in the context of pastoral care, my purpose is to foreground the need to shift our focus from the artificial companion to companionable AI.
‘Artificial companion’ more often than not makes us preoccupied with AI itself at the expense of the role that humans have to play in companionship. By contrast, ‘companionable AI’ directs our attention to the purpose behind creating AI to be companionable. Rather than exploring the possibility of replacing human caregivers with AI-driven caregiving systems, we should probe AI’s complementary strengths to augment human capacity for pastoral care and companionship. Companionable AI can be assessed by whether religious elements can be more effectively integrated into existing relationships in pastoral care. For instance, the Christian notion of hope can be used to evaluate pastoral carebots: it should be examined whether care-receivers in a specific pastoral context hold the hope for God’s faithfulness and deliverance more firmly with the addition of carebots to pastoral care. From this it follows that companionable AI is confined to specific pastoral contexts, each fraught with challenges to spiritual growth and to human-human and/or human-God relationships. Hence, companionable AI or pastoral carebots should be tailored to specific caregiving practices and to sustaining these relationships.
It can be anticipated that companionable AI will increasingly be deployed in religious pastoral care and in caregiving practices at large. However, there are two contrasting responses to AI-driven caregiving practices. On the one hand, some are intimidated by the hype surrounding AI, such as artificial superintelligence, and so discard carebots or AI-driven caring systems altogether. On the other hand, some users over-trust companionable AI to the point of replacing human-human caring relationships with AI-human ones. Companionable AI for Christian pastoral care allows us to refrain from making an either-or decision about AI-driven pastoral care. It demonstrates a critical reception of AI-driven systems in religious communities and underwrites the worth, meaning, and value of human participation in pastoral care.
About the contributor:
Dr Ximian Simeon Xu (CTMF Postdoc Research Affiliate) is Duncan Forrester Fellow, a joint fellowship at the Institute for Advanced Studies in the Humanities and the School of Divinity’s Centre for Theology and Public Issues.
Notes

1. Yorick Wilks, “Foreword,” in Close Engagements with Artificial Companions: Key Social, Psychological, Ethical and Design Issues, ed. Yorick Wilks (Amsterdam: John Benjamins, 2010), xi.
2. Church of England, press release, https://www.churchofengland.org/media/press-releases/church-england-alexa-skill-asked-75000-questions-first-year
3. David Lyall, The Integrity of Pastoral Care (London: SPCK, 2001), 12.
4. Stephanie D. Preston and Frans B. M. de Waal, “Empathy: Its Ultimate and Proximate Bases,” Behavioral and Brain Sciences 25 (2002): 4.
5. William Young, “Virtual Pastor: Virtualization, AI, and Pastoral Care,” Theology and Science 20, no. 1 (2022): 6–22.