Prof. John Wyatt is Emeritus Professor of Neonatal Paediatrics at University College London. He presently serves as Senior Researcher at the Faraday Institute, Cambridge, where he leads a multi-disciplinary research project, funded by the Templeton World Charity Foundation, into the social and theological implications of advances in artificial intelligence and robotics technology.
At the seminar, Prof. Wyatt reminded us that artificial intelligence, robotics and information technology are changing the nature of our world, our society and the way we interact as humans. Billions of dollars are being invested in military, industrial, commercial and personal robotics, and artificial intelligence systems are being introduced in many areas of employment.
One commentator has suggested that the workers of the future will be divided into two categories: those who are good at working with intelligent machines and those who are replaced by them. Estimates of the proportion of jobs at risk from automation range from 35% (in the UK) to 77% (in China). Intelligent systems are entering healthcare, providing diagnostic and therapeutic advice for physicians and healthcare professionals, together with "companion robots" for elderly and vulnerable individuals.
These developments have given rise to new dystopian fears that advanced intelligent systems may threaten the future of the human race.
It is possible to trace several underlying trends behind the relationships between humans and intelligent machines. First, there is a movement from the machine to the human. Increasingly we understand ourselves as machines made out of flesh. Secondly there is a movement from the human to the machine. We project our humanity onto the machine in the powerful and largely unconscious process of anthropomorphism. This ability is part of our humanity and brings profound positive benefits. But at the same time, it renders us open to manipulation and deception.
In the Christian tradition created representations have varied meanings. On the one hand, there is the “icon” - a representation of the human form which points towards a hidden and greater reality. On the other hand, there is the “idol” - a false representation which misleads and deceives. The challenge for Christian believers will be to discern the distinction between these human representations in machine intelligence.
In Christian thinking, human beings are unique in being created in God's image. Just as the Holy Trinity is revealed as three persons in union and communion, so we too carry personhood. We are created to participate in I-you relationships as well as I-it interactions. One of the troubling issues raised by advanced computer simulations of humanity is that the distinction between I-you and I-it will become increasingly blurred.
In the Nicene Creed, the Church Fathers developed the formula that Jesus Christ was begotten, not made. This is a critical and profound distinction. That which we make is a product of our will and is ours to control and manipulate. That to which we give birth is a gift from our being and is equal to us in dignity and status. We make intelligent machines, but they remain distinct from the children to whom we give birth.
The challenge for Christian believers will be to develop positive responses and openness to the opportunities of advanced technology whilst developing resistance and resilience against the de-humanizing and manipulative forces that may be unleashed.
During the Q&A session, the following points were strongly recommended:
a. That Christians should model treating other human beings as persons, sharing our relationships with God and with one another as fellow humans, not as "machines made of flesh";
b. To distinguish carefully the virtual world and human representations in intelligent machines from reality;
c. To be aware that, given the evil nature of human beings, the machines so created could turn out to be "evil tools";
d. To save ourselves from becoming AI addicts by learning to "fast" from advanced technology, and to avoid being dominated or manipulated by machines.
Dr. NG Tze-Ming, Peter
Lumina College Research Institute