On Tuesday, January 18, 2022, AUP's 60th-anniversary Presidential Lecture Series, titled Technology and the Human Future, hosted its second event: Bipin Indurkhya, Professor of Cognitive Science at the Jagiellonian University in Krakow, Poland, spoke on the topic of "Faking Emotions and a Therapeutic Role for Robots: Ethics of Using AI in Psychotherapy." The Presidential Lecture Series, organized by the office of AUP President Celeste M. Schenck, invites speakers to participate in live online events, engaging with both theory and practice in responding to the question of how technology will continue to affect our lives beyond the Covid-19 pandemic.
President Schenck opened the lecture by commenting on the possible unintended consequences of rapid technological advancement, highlighting how universities could play a role in policy discussions relating to disruptive technologies. She pointed to AUP's new MSc in Human Rights and Data Science, which emphasizes the ways in which technology can be used to advance human rights, as a key example of this process in action.
Schenck then introduced Professor Indurkhya, who began his presentation by discussing AI programs that fake emotion. Eliza, one of the world's first chatbots, was doing this as far back as the 1960s: it scanned user input for keywords and replied with scripted phrases that mimicked emotional engagement. Users immediately anthropomorphized the chatbot, treating it as if it were a real person. Eliza would later be integrated into robotic systems such as Sony's XDR, which mimicked both human emotion and human movement. Contemporary examples include Boston Dynamics' Atlas, which excels at humanlike movement but does not replicate humanlike emotion, and the Ameca robot, which debuted at the CES electronics show this year and can mimic high-level human emotion. The possible applications of such robots and chatbots range from job automation to augmentation, though Indurkhya focused his lecture mainly on the therapeutic context.
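As a concrete illustration of that keyword-matching mechanism, below is a minimal Python sketch in the spirit of Eliza. The rules, phrasings and the `respond` function are invented for demonstration; Weizenbaum's original script was more elaborate, handling ranked keywords and pronoun reflection.

```python
import random
import re

# Hypothetical keyword -> response templates in the spirit of Eliza's
# Rogerian script; these rules are invented for illustration.
RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your {0}."]),
    (re.compile(r"\bi am (.+)", re.IGNORECASE),
     ["How does being {0} make you feel?"]),
]
# Generic fallbacks keep the conversation going when no keyword matches.
FALLBACKS = ["Please go on.", "I see. Can you tell me more?"]

def respond(user_input: str) -> str:
    """Pick a canned, emotionally toned reply by matching keywords in the input."""
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(respond("I feel lonely these days"))
    # e.g. "Why do you feel lonely these days?"
```

The program has no understanding of the conversation; it is precisely this surface-level pattern matching, mistaken for empathy, that led early users to treat Eliza as a person.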
One of the first applications of robots as therapeutic tools was the Paro seal, which was adopted by nursing homes, initially in Japan and later around the world. These animatronic seals can play and interact with users; they have proved effective in combating loneliness among older people and can positively impact people with dementia. Chatbots have also been created to help people living with mental illness or coping with trauma. Replika, for example, can mimic a deceased person, and Woebot has been found to sometimes outperform human therapists, as users feel more comfortable talking to a robot: the conversation is anonymous, user-centric and constantly available. The novelty of the system was also considered a factor.
Indurkhya went on to explain that the use of these robots raises multiple ethical questions: Who can access the data? Will the code be public? Who is responsible in the case of physical or mental harm? There are also societal considerations: What are the consequences of reduced human contact? What effect does an AI's avoidance of political or moral topics have on the user? In the final section of the talk, "AI and Deception," Indurkhya asked whether AI should be allowed to lie. Humans often lie to provide emotional support, a phenomenon called social lying. Even if an AI should not be allowed to lie, should it be allowed to deceive in other ways, for example by changing the subject? Indurkhya argued that, given the speed at which AI is developing, the answers to such questions need to be agreed on soon. After all, he noted, AI holds the potential to learn to lie all on its own.
The next event in the Presidential Lecture Series will take place on January 25, 2022, when AUP's own Professor Georgi Stojanov will speak on the topic "Your Personal Diary Is No Longer Private and You Are Not Even the Primary Author." Registration for the event is available online.
Significant contributions to this news piece were made by Jackson Vann, a graduate student in AUP's MSc in Human Rights and Data Science.