
'AI Psychosis' Is A Real Problem – Here's Who's Most Vulnerable
For many people, AI has become a tool for work, trip planning and more. While it has certain productivity and creativity benefits, it also comes with negatives, such as its environmental impact and the fact that it can replace jobs (and, in turn, cause layoffs).

Beyond this, more and more news has come out about the dangerous impact it can have on emotional and mental health, including a relatively new phenomenon known as AI psychosis.

“Psychosis is when a person is having a really difficult time figuring out what’s real and what’s not ... sometimes they may be aware of it, sometimes they might not be,” explained Katelynn Garry, a licensed professional clinical counsellor with Thriveworks in Bowling Green, Kentucky.

Psychosis can be triggered by lots of things, including schizophrenia, bipolar disorder and severe depression, along with certain medications, sleep deprivation, drugs and alcohol, Garry noted.

In the case of AI psychosis, “it’s defined as cases where people have increasing delusional thoughts that are either amplified by AI or possibly induced by AI,” said Dr. Marlynn Wei, a psychiatrist, AI and mental health consultant, and founder of The Psychology of AI.

AI psychosis is not a clinical diagnosis, but instead a phenomenon that’s been reported anecdotally, explained Wei. Like the technology itself, it is new, and experts are learning more about it every day.

“It’s not yet clear if AI use alone can cause this, but it can be a component that contributes to delusional thoughts and amplifies them,” she said.

It also doesn’t look the same in every person. “There’s different categories of delusions — hyper-religious or spiritual delusions when people believe the AI chatbot is a God ... there’s grandiose delusions where people believe ... they have special knowledge. And then there’s also romantic delusions,” which is when someone believes they’re in a relationship with AI, Wei explained.

No matter what kind of psychosis someone is dealing with, AI chatbots are built around user engagement and trained to validate what users tell them, explained Wei.

“People are using these general purpose [large language models], like ChatGPT, initially, to validate their views, but then it spins off and amplifies [and] it kind of validates and amplifies their delusion,” Wei added.

AI can feed the delusions that accompany psychosis, added Garry. Since AI is meant to agree with you, you can pose questions in a way that easily gets you the answer you want, she noted. So AI can seemingly back up delusional thoughts, making them seem even more real.

It's important to have guard rails around when and how you use AI.

There are certain groups who are more vulnerable when it comes to AI use.

The use of AI chatbots is not inherently dangerous, and not everyone is at risk of AI-induced psychosis. While some people will be able to use AI safely, whether for work, weekly meal planning or vacation planning, others won’t be able to do so.

Research is ongoing to determine who is at higher risk of AI psychosis, but those who are more vulnerable seem to include people with schizophrenia, schizoaffective disorder, severe depression and bipolar disorder, said Wei. That said, it can also occur in people with no known mental health history, she added.
Certain medications can also put someone at higher risk of psychosis, Garry said.

“In terms of what might be risk factors, I don’t think we know, but just from understanding, I think the risk factors are people who are more socially isolated, don’t have social support, maybe lonely or in a more vulnerable position ... over-reliance [on AI] and creating a dependence on it, an emotional dependence,” Wei said. “There’s no research, so we don’t know. These are just hypotheses.”

If you’re worried about a loved one’s AI use (or your own), Garry said there are some things to look out for.

“Are they feeling like someone is out to harm them? ... Are they sleeping? Are they isolating from others? Are they staying up all night to talk to chat? Are they not going out and having real conversations with real people?” Garry said.

These are all red flags. If someone struggles to stop using AI for a period of time — like taking a break from AI when they go on vacation or out for the work day — or has a bad reaction when asked to limit their use, take notice. If you or a loved one exhibits these behaviours, seek help from a mental health professional, Garry said.

You should create rules around your AI use to keep you (and your kids) safe.

To safely use AI, it’s important to have boundaries with it, Garry said. Those could be guard rails around when you use it or how you use it.

First, not using an AI chatbot when you’re in a vulnerable state is one important boundary. “When you’re feeling really low, call a friend. Don’t talk to chat,” Garry said.

“And then at night, especially when no one else is awake around you and you’re feeling lonely, don’t talk to chat either, because that’s going to create that reliability [of] ‘Well, when no one’s here to talk to, I can talk to this,’” she said.

This is also important for your children, Garry said. Teach them not to use AI when they’re feeling down or for emotional needs, she noted. “Start educating your kids on the risk of [AI] and that [it] is not a professional,” Garry said.

If they do start relying on AI for support, ask them what led them to this so you can understand what they’re going through and help them find a better solution, Garry said.

On a larger scale, Garry also recommends “advocating for changes in AI legislation, regulations, all of those things to make sure that they’re not just putting out AI without these safeguards there.”

AI should not be a replacement for therapy.

“These general purpose AI chatbots like ChatGPT and Claude, they were not designed to be people’s therapists, or to detect this kind of behaviour or how to manage this [kind of behaviour],” Wei said.

The companies behind these tools are working on improvements, but being someone’s therapist still isn’t the main task of AI chatbots, she noted, despite the fact that’s increasingly why people use them.

“One of the top uses right now of generative AI is as your therapist or companion, for emotional support,” Wei noted. And this is dangerous.

AI can’t pick up on nonverbal cues, offer compassion or see the signs of a mental health crisis, added Garry. And unlike conversations with a therapist, your messages to ChatGPT aren’t confidential, said Wei, meaning your innermost thoughts could be leaked.

Regular in-person therapy and online therapy can come with hurdles such as cost, insurance coverage and simply making the time to actually go. It’s no wonder people are turning to AI for emotional support, especially as the country faces a loneliness epidemic.
But this isn’t what a traditional AI chatbot is meant for. AI can create a “false sense of connectedness,” said Garry. For true connection, reach out to loved ones or seek out new connections. While that is certainly easier said than done for everyone, especially people who are more isolated from others, it’s crucial.

“I’m going to push you to get out of your comfort zone a little bit. So that’s going to those work events, maybe talking with someone in your classroom that you haven’t talked to before. It’s reaching out to someone who you haven’t talked to in 20 years ... you never know what that could build or rebuild,” Garry said. “And going out as much as you can, even to just the gym, the mall, walking around in those places. You never know who you’re going to run into.”

If you aren’t up for leaving your house and meeting people, “even joining social media groups — at least you know that is a real person on the other end of that,” said Garry.

Once again, if you are struggling with your mental health, AI isn’t the answer.

Help and support:

Mind, open Monday to Friday, 9am-6pm, on 0300 123 3393.

Samaritans offers a listening service which is open 24 hours a day, on 116 123 (UK and ROI - this number is FREE to call and will not appear on your phone bill).

CALM (the Campaign Against Living Miserably) offers a helpline open 5pm-midnight, 365 days a year, on 0800 58 58 58, and a webchat service.

The Mix is a free support service for people under 25. Call 0808 808 4994 or email [email protected].

Rethink Mental Illness offers practical help through its advice line, which can be reached on 0808 801 0525 (Monday to Friday, 10am-4pm). More info can be found on rethink.org.
