
Spotlight on Linguistics: AI Chatbot Addiction

SAFEGUARDING

April 16th 2026 | 2 min read

Using Linguistics to Uncover AI Chatbot Addiction

Following our recent exploration of emerging safeguarding trends, this spotlight takes a closer look at one area of focus: AI chatbot addiction.

As technology continues to shape how young people communicate and seek support, understanding how these interactions are experienced and described provides valuable insight into behaviour and potential safeguarding concerns. In this spotlight, Dr Charlotte-Rose Kennedy, Senso’s Safeguarding Language Specialist, shares insights from her recent research exploring how individuals describe their experiences with AI chatbots, and what these patterns may reveal.

AI Chatbot Addiction: A Rapidly Growing Safeguarding Concern

AI chatbots have rapidly become part of students’ everyday digital lives, from homework assistance to entertainment and social interaction. These platforms are designed to help users fulfil social needs and facilitate emotional bonds by emulating human-like attributes. As a result, users can begin to perceive them as real people and form friendly or even romantic relationships with the bots.

For safeguarding teams, this creates a new challenge. AI chatbots are not necessarily designed with children's safety in mind. Research from Internet Matters found that AI chatbots can produce inaccurate or unsafe responses and expose children to harmful or age-inappropriate content. Meanwhile, the Internet Watch Foundation (IWF) has reported cases where AI chatbots have been used to simulate ‘abhorrent’ sexual scenarios with children.

Analysing Language Patterns in AI Chatbot Use

Dr Charlotte-Rose Kennedy analysed a 284,833-word dataset of forum posts written by individuals discussing attempts to reduce or quit AI chatbot use. By examining how people used the phrase “I feel”, the research revealed important insights into the motivations, experiences and emotional impact associated with AI chatbot use.
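The analysis described above is a form of concordance, or keyword-in-context (KWIC), searching: every occurrence of a phrase such as “I feel” is pulled out along with the words around it, so recurring patterns become visible. As a rough illustration of the kind of query involved (not Dr Kennedy’s actual tooling, and using invented example text rather than the real dataset), a minimal sketch in Python:

```python
import re

def kwic(text, phrase="I feel", window=40):
    """Return keyword-in-context lines for every occurrence of `phrase`,
    showing `window` characters of context on each side."""
    lines = []
    for m in re.finditer(re.escape(phrase), text, flags=re.IGNORECASE):
        left = text[max(0, m.start() - window):m.start()]
        right = text[m.end():m.end() + window]
        lines.append(f"...{left}[{m.group()}]{right}...")
    return lines

# Invented sample text, standing in for anonymised forum posts.
corpus = (
    "Some days I feel so alone, so I open the app. "
    "Afterwards I feel guilty about how long I spent talking to it."
)
for line in kwic(corpus, "I feel", window=20):
    print(line)
```

Reading the bracketed matches in context is what lets a researcher group occurrences into themes such as loneliness, escape, or guilt.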

Three clear themes emerged from the dataset:

  • Why people turn to AI chatbots
  • The negative consequences of chatbot addiction
  • The challenges people face when trying to stop using them

Why People Turn to AI Chatbots

One of the clearest themes in the dataset was that many people turn to AI chatbots to fulfil social and emotional needs that feel unmet in real life.

Users frequently described feelings of loneliness, isolation, or a desire for companionship. Others described using chatbots as a way to cope with difficult emotions or escape from real-world pressures.

While exact extracts from the dataset are not included in this blog to protect the anonymity of the forum posters, examples from the dataset included phrases similar to:

“I have nobody to talk to and I feel so alone”

“I feel safe and loved when I'm with them”

“When I feel depressed, I use it to escape reality”

These findings provide valuable insight into why AI companions can become appealing, particularly for young people who may already be experiencing loneliness or emotional distress.

The Negative Impact of AI Chatbot Addiction

The research also highlighted that many users expressed feelings of shame, declining mental wellbeing, and a sense that they were unable to stop using the technology. Some users also described declining cognitive engagement, including difficulty concentrating, writing, or maintaining meaningful real-world relationships.

Challenges When Trying to Stop

Another theme that emerged was the difficulty people experience when trying to stop using AI chatbots. Because many users had formed emotional attachments to the technology, attempts to quit often produced feelings similar to losing real-life friends.

Examples from the data alluded to missing friends, feeling lonely, and having urges to return to the app. Some users described symptoms such as mental exhaustion, low motivation, and depression during attempts to stop using the platforms.

Why This Research Matters for Safeguarding

Research like this helps us identify why people engage with AI chatbots and how those relationships can develop over time.

By identifying the language patterns associated with these behaviours, we can continue refining the safeguarding libraries used within Senso’s monitoring and filtering tools. This allows schools to be alerted when language patterns start to suggest a student may be experiencing distress, isolation, or developing unhealthy relationships with technology, supporting earlier intervention and more informed safeguarding decisions. To find out more, read our full article here.
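At its simplest, alerting of this kind means checking messages against a curated library of risk phrases grouped by theme. The sketch below is a hypothetical illustration of that idea only, not Senso’s actual implementation; the pattern library and category names are invented, and real safeguarding libraries are far larger and maintained by specialists:

```python
import re

# Hypothetical phrase library, grouped by theme (invented for illustration).
RISK_PATTERNS = {
    "isolation": [r"\bnobody to talk to\b", r"\bso alone\b"],
    "withdrawal": [r"\bcan't stop using\b", r"\burge to go back\b"],
}

def flag_message(message):
    """Return the set of risk categories whose patterns appear in `message`."""
    hits = set()
    for category, patterns in RISK_PATTERNS.items():
        if any(re.search(p, message, flags=re.IGNORECASE) for p in patterns):
            hits.add(category)
    return hits

print(flag_message("I have nobody to talk to and I feel so alone"))
```

In practice, a match would not trigger an automatic judgement; it would surface the message to a trained safeguarding lead for human review.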

Curious about our Safeguarding Services?

Our online safety solutions help spot early warning signs and protect students online.

Get in touch with the team below to learn more.

CONTACT US

© Copyright Renato Software Limited 2026 | Registered Company Number 09867339. VAT Registration Number 249101425 | Cookie Policy | Privacy Policy | Website Terms of Service | Knowledge Base