WHAT ARE THE EFFECTS OF LONG-TERM EXPOSURE TO ARTIFICIAL INTELLIGENCE (AI) INTERACTIONS ON EMOTIONAL INTELLIGENCE AND EMPATHY IN HUMANS?
DOI: https://doi.org/10.53555/ephijse.v11i1.291

Keywords: AI, Emotional intelligence, Empathy, Human, Social, Relation

Abstract
This paper reviews the long-term effects of interaction with artificial intelligence (AI) systems on emotional intelligence (EI) and empathy. AI has the potential to support emotional growth by offering guidance on recognizing, managing, and expressing emotions, particularly for people who struggle with emotional interaction. AI-based programs can help users engage with emotionally distressing situations and can promote emotional awareness and communication in their relationships. Nonetheless, even though AI can imitate empathy and provide emotional support, reliance on it may reduce face-to-face engagement, which is critical for deep emotional relationships. AI cannot simulate the full range of emotional and relational experience that comes from being human, and prolonged use of AI for emotional engagement increases the risk of social disconnection and impoverished relationships. The bottom line is that AI systems should serve as a guide rather than a replacement for human interaction. Underlying this discussion is the need to balance emotional guidance from AI systems with human interaction and relationships. Societies must remain conscious that AI is a rapidly changing technology increasingly designed for emotional engagement and development, and must maintain a human-centric model of emotional health and education.