In the next graduate workshop, we will have Rachel Katz presenting her work “Computing and Counselling: Ethical Issues in AI-Facilitated Psychotherapy.” The workshop will take place on Wednesday, March 15th, 2023, from 12:00 to 1:30 pm (EST) at VC303, the IHPST Common Room/Lounge, on the third floor of Victoria College. This is a hybrid event. If you are planning to join the meeting via Zoom, please RSVP in advance through the following link: https://utoronto.zoom.us/meeting/register/tZ0odemoqjkqHdF_5wtmod9PYXgFq67gm7MG
Computing and Counselling: Ethical Issues in AI-Facilitated Psychotherapy
AI-facilitated psychotherapy has grown significantly in the last five years, with interventions that range from chatbots to interactive apps that make use of methods such as cognitive behavioural therapy (CBT). These apps and chatbots have their uses: they can serve as a stop-gap solution while a patient is on a lengthy waiting list for in-person therapy services, or act as a check-in point for patients between appointments. AI therapy bots can help a patient new to psychotherapy take their first steps in disclosing their symptoms, helping to reduce stigma. They can also help reduce clinician burnout by taking on the routine tasks of administering psychotherapy (such as patient intake) and reducing a clinician’s caseload.
However, there are consequences of the AI-ification of psychotherapy that have been underexplored in the philosophical literature. Does the outsourcing of some psychotherapy interventions actually address the ever-increasing gap between the supply of human psychotherapists and the patient demand for their services? How does the use of AI therapy tools affect patients’ sense of community (both with respect to the bioethical “disease community” and the patient’s more general community)? Additionally, most AI therapy tools use formulaic approaches to psychotherapy, such as CBT, a format that is demonstrably harmful for some mental health concerns. Will the outsourcing of psychotherapy to apps and chatbots leave more room for patients who require more intensive dialogical therapy, or will it force more patients to contort their therapeutic needs to be better served by CBT? How might public health units steer patients towards AI psychotherapy options, and how might governments use these limited AI tools to replace human therapists entirely in already underserved communities? These and other vital ethical questions must be discussed before these tools develop further.