People with social anxiety are more likely to become overly dependent on conversational AI agents

A recent study examined how social anxiety, loneliness, and rumination contribute to the problematic use of conversational AI agents. The results showed that people high in social anxiety are more likely to feel lonely and engage in ruminative thought patterns, and consequently tend to turn to conversational AI agents for relief. The study was published in Computers in Human Behavior.

Conversational artificial intelligence (AI) agents are pieces of software designed to engage in human-like conversations with human users. With recent advances in natural language processing and machine learning, conversational AI agents are becoming increasingly popular. These include software agents such as Apple Siri, Amazon Echo, ChatGPT and others. In a 2018 survey, 65% of US participants said virtual assistants had changed their behaviors and daily routines.

The flourishing of conversational AI has also created an opportunity for problematic use of this technology, namely by individuals who become overly dependent on it. The personalized responses of conversational AIs and the human-like nature of the conversations can keep users glued to their devices for longer and create a strong attachment to such technology.

Previous studies have shown that people high in social anxiety are more likely to develop problematic use of technology, and it is entirely possible that the same is true for conversational AI. Problematic use of conversational AI refers to overusing it in an addictive way, which often causes undesirable consequences in daily life.

Study author Bo Hu and his colleagues proposed that socially anxious individuals can use conversational AI to compensate for the deficits they experience in social interactions and, in this way, more easily develop problematic use. This effect might be expected to be stronger in participants who attribute a human-like mind to the software agent (mind perception), as this would facilitate emotional attachment to it. The researchers also expected that rumination and loneliness would play a role in this link.

Participants were 516 conversational AI users recruited through the crowdsourcing platform Wenjuanxing. They ranged in age from 18 to 59 years; 76% of them had a bachelor's degree, while 14.5% had a master's degree or higher.

Participants completed assessments of loneliness (a short version of the UCLA Loneliness Scale), rumination (the Ruminative Response Scale, short version), mind perception (5 items), and problematic conversational AI use (the Bergen Social Media Addiction Scale, adapted to refer to conversational AI instead of social media; for example, "I spend a lot of time thinking about conversational AI").

Rumination refers to a pattern of repetitive and intrusive thoughts about negative experiences, emotions, or problems. It involves dwelling on past events, analyzing them extensively, and repeating the same concerns over and over without reaching a solution or finding a way forward.

The results showed that individuals with higher social anxiety scores tended to be lonelier, more prone to rumination, and more likely to use conversational AI in problematic ways. Both loneliness and rumination were positively associated with problematic use of conversational AI.

The researchers tested a model proposing that social anxiety leads to loneliness, that loneliness leads to rumination, and that rumination in turn makes a person prone to problematic use of conversational AI. The results were consistent with this model of relationships between the studied factors. They also indicated that the effect of social anxiety on the problematic use of conversational AI was strongest when mind perception was high, i.e., when participants ascribed a human-like mind to the AI agent.

As expected, the results revealed a positive association between social anxiety and problematic use of conversational AI. "Socially anxious individuals may have difficulty engaging in interpersonal interactions and developing relationships with other people in face-to-face settings. Compared to human interaction, conversational AI can provide these individuals with a more comfortable and relaxed pseudo-interpersonal experience, which could ultimately make them feel in their comfort zone and lead them to become addicted to technology," the study authors wrote.

The study makes an important contribution to the scientific understanding of the psychological aspects of AI use. However, it also has limitations that need to be taken into consideration. In particular, the study design does not allow for conclusions about cause and effect. Additionally, all study participants were Chinese, so the results may not generalize to people from other cultures.

The study, "How Social Anxiety Leads to Problematic Use of Conversational AI: The Roles of Loneliness, Rumination, and Mind Perception," was written by Bo Hu, Yuanyi Mao, and Ki Joon Kim.


