Even in a mental health context, though, AI is not all bad. “We have to really understand the risks and the limitations while also honoring the benefits because there definitely are many,” says Pharaon. One is that AI can swiftly and efficiently review medical records and documents. Petersel gives one example: pasting pages and pages of stream-of-consciousness journal entries into an AI tool, then asking, “Can you tell me in a few sentences what you think my opinion is based on all these?”
But AI can only be as good as the prompts we provide. For example, as Petersel notes, if you tell an AI chatbot that you want to work on developing gratitude but aren’t sure how, it may come back with 10 suggestions. But what if one suggestion is to take a walk around the block and the block you live on happens to be unsafe? Or what if another is to reach out to your dad and you have a really complicated relationship with him? You need the discernment to make that judgment yourself, says Petersel. Still, an experiment published earlier this year in PLOS Mental Health pitted ChatGPT against human therapists and found that participants could rarely tell the difference between responses written by ChatGPT and those written by a therapist. The ChatGPT responses were also rated higher.
While studies like these make a case for integrating AI into therapeutic settings with appropriate boundaries and extensive oversight, concerns remain. Chatbots don’t understand nuance, says Pharaon, nor can they provide the same context for a situation that a real therapist can. There is also an immediacy to the AI approach that, while appealing in theory, says Donahue, doesn’t allow for healing from grief or trauma on an appropriate human timeline. The AI advice may feel right in the moment, but what does this approach do for clients long-term? “When this level of insight is right in our pocket, what impact does that have on our own ability to look within ourselves and make an informed decision based on our own experiences and values?” Petersel wonders.
Building these relationships with AI and seeking comfort from a chatbot instead of an actual person can reinforce a sense of isolation and impair our ability to connect IRL. The threat of artificial intelligence to human intelligence is being discussed ever more frequently, particularly in the context of education and how the technology is eroding our ability to learn and think critically. But just as important is the danger it poses to human connection, which is central to therapy and its efficacy.
“On a chemical, physical, and energetic level, being in the presence of another human is a big part of care,” says Petersel, “even virtually.” That connection can be especially powerful for those who feel disconnected from their own emotional response, says Donahue. While there can be a therapeutic place for AI, it should not, according to experts, ever be someone’s sole provider. Something to remember as the current administration continues to give carte blanche to big tech companies. (Most recently, Republicans sneakily inserted language into a budget reconciliation bill banning states from regulating AI in any capacity for an entire decade.)
AI is not all good nor all bad, but it won’t ever be able to replace the value of human connection in therapy. “AI really shortchanges someone from experiencing the beauty and complexity of growing and learning through human relationships,” says Donahue. “We need human connection to survive. People need people, period.”