Pope Francis pinpoints moral dangers of 'amazing and powerful' AI

by Michael Sean Winters

The pope's World Day of Peace messages come each year at a time when the world is not so much peaceful as busy. Issued in the weeks before Christmas in anticipation of New Year's, these messages are often lost. This year's message, which focused on artificial intelligence, is too important to get lost. 

The phrase "artificial intelligence" is itself a kind of oxymoron. As the pope states, "To speak in the plural of 'forms of intelligence' can help to emphasize above all the unbridgeable gap between such systems, however amazing and powerful, and the human person: in the end, they are merely 'fragmentary', in the sense that they can only imitate or reproduce certain functions of human intelligence." Put differently, artificial intelligence is itself the outcome of human intelligence. We created it; it does not create us. 

Those words reminded me of Professor Paul Weiss, my philosophy teacher many years ago. In his course on metaphysics, he told us that computers cannot think if thinking is what we humans do. "A computer cannot make a period at the end of the sentence, it can only make a dot," he told us. "It cannot know that the period is ordered to the capital letter at the beginning of the sentence or, as was the case with the great poet e.e. cummings, that it is not so ordered." Forty years on, I still remember those words as if freshly spoken. 

Information technology may not be creative like a poet, but it can be as manipulative as the emotions the poets evoke. "Moreover, from the digital footprints spread throughout the Internet, technologies employing a variety of algorithms can extract data that enable them to control mental and relational habits for commercial or political purposes, often without our knowledge, thus limiting our conscious exercise of freedom of choice," the pope writes. "In a space like the Web, marked by information overload, they can structure the flow of data according to criteria of selection that are not always perceived by the user."

The verb "control" jumps off the page. We have become reliant on information technologies for so much of what we do, even if we rarely understand how those technologies work, still less how they can and do manipulate our own ethical choices.

"Information overload" is an almost insoluble problem. If it is true that computer technologies can only do what they have been programmed to do, it is also the case that participation in modern society requires us to rely on computers and social media to provide us with information that is itself programmed. And the programming is done by people with interests and sometimes with ideological objectives that are opaque to those of us consuming the information. Nor is transparency enough, although it would be a step forward. How many people would take the time to read the disclaimers when busy searching for information they need? And, information is not like other human products. You can do without a dishwasher and wash your dishes by hand. There are a variety of means by which you can heat your house or get to work. But who can get through the day with access to information? 

The pope offers some cultural proposals about how to confront the problems posed by AI. He calls for "an appropriate formation in responsibility for [technology's] future development. Freedom and peaceful coexistence are threatened whenever human beings yield to the temptation to selfishness, self-interest, the desire for profit and the thirst for power." He sets the essential benchmark for harnessing technology to human flourishing: "The inherent dignity of each human being and the fraternity that binds us together as members of the one human family must undergird the development of new technologies and serve as indisputable criteria for evaluating them before they are employed, so that digital progress can occur with due respect for justice and contribute to the cause of peace." 

The message shows a highly informed awareness of the varied problems raised by AI and other advanced technologies. The pope points to "a serious problem when artificial intelligence is deployed in campaigns of disinformation that spread false news and lead to a growing distrust of the communications media." He also highlights "other negative consequences of the misuse of these technologies, such as discrimination, interference in elections, the rise of a surveillance society, digital exclusion and the exacerbation of an individualism increasingly disconnected from society."

And Pope Francis echoes the concern about technocratic paradigms that he identified in Laudato Si’ nine years ago. "Human beings are, by definition, mortal; by proposing to overcome every limit through technology, in an obsessive desire to control everything, we risk losing control over ourselves; in the quest for an absolute freedom, we risk falling into the spiral of a 'technological dictatorship,' " he writes. "Recognizing and accepting our limits as creatures is an indispensable condition for reaching, or better, welcoming fulfilment as a gift."  

The pope considers the truly frightening applicability of these advanced technologies to war-making, as well as their potential to deepen our already distorted understanding of the role of human labor in the well-being of society as merely a question of efficiency. "Jobs that were once the sole domain of human labour are rapidly being taken over by industrial applications of artificial intelligence," he writes. "Respect for the dignity of labourers and the importance of employment for the economic well-being of individuals, families, and societies, for job security and just wages, ought to be a high priority for the international community as these forms of technology penetrate more deeply into our workplaces."

There is no doubt that society must take steps to confront these dangers. But who? And how? The market is more the problem than the solution. The pope urges "the global community of nations to work together in order to adopt a binding international treaty that regulates the development and use of artificial intelligence in its many forms." The track record of the United Nations at preventing wars does not encourage confidence it could address AI, does it? The European Union recently reached accord on an act to regulate AI, but rogue nations like Iran, Russia and North Korea will not be bound by any international rules. A bipartisan group of senators released a bill in November, but the U.S. Congress can't seem to reach agreement on anything these days.

The pope is not to blame for the dysfunction of world politics, so no one should fault this document for being stronger in the diagnosis than in the cure. What is remarkable about Pope Francis, however, is his ability to think through such complex issues and discern the moral conundrums they present. 

AI is difficult to understand, but recognizing the need to regulate it entails no such difficulty. AI may only do what we can program it to do, but it can execute those programs at a speed with which our poor minds can't keep up. That alone should cause worry. AI compensates for its lack of creativity with efficiency on steroids. Steroids are exceedingly dangerous. 

The deeper problem is that, once again, mankind's capacity for moral advancement seems incapable of keeping up with our capacity for technological advancement. Pope Francis has issued the warning, framed the problems and offered a path forward that is humane. The question is whether our societies and cultures are capable of taking that path. 
