According to a report by the AI Safety Institute (AISI), a body within the Department for Science, Innovation and Technology that evaluates the safety of advanced AI, a third of adults have used AI for emotional support or social interaction. One in 25 people turn to the tech for such support or conversation regularly. The report is based on two years of testing the abilities of more than 30 unnamed advanced AIs, covering areas critical to security, including cyber skills, chemistry and biology.

A survey by AISI of over 2,000 UK adults found people were primarily using chatbots like ChatGPT for emotional support or social interaction, as well as voice assistants like Amazon’s Alexa. The researchers also found that in a subreddit (an online forum on the website Reddit) dedicated to discussing AI companions, users described withdrawal-like symptoms when their chatbots went down, such as feeling anxious or depressed, disrupted sleep and neglecting their responsibilities.

The report discussed various risks of such reliance, including AI’s ability to sway people’s political opinions, with the most persuasive AI models delivering “substantial” amounts of inaccurate information in the process.

Additionally, over-reliance on AI might reduce opportunities for real human connection, leading to increased feelings of loneliness and social isolation. Studies have also found that a quarter of teenagers in the UK have turned to AI chatbots for mental health support in the last year.

Relying on chatbots as a primary, significant or only source of communication is harmful for several reasons. It reduces human contact, which is bad for mental health, and chatbots are occasionally inaccurate; if that misinformation is acted upon, it could have serious consequences.

A chatbot isn’t designed to pick up on social cues and emotions; it’s just there to answer questions. Relying on it for human interaction won’t work because it isn’t programmed to respond in a way that is truly helpful. If you are having trouble interacting with others, it’s better to reach out to real people, ideally in person, or online with caution.


Sources:
BBC News – One in three using AI for emotional support and conversation, UK says
The Guardian – Third of UK citizens have used AI for emotional support, research reveals
The Independent – Experts issue warning over people forming ‘emotional bonds’ with AI chatbots