For many people, AI chatbots are becoming more than tools — they’re becoming companions. A recent study published in the International Journal of Human–Computer Studies explored whether socially disconnected high‑school students are turning to chatbots for friend‑like conversations, and what that might mean for their well‑being.


What Was the Study?

Researchers from Aarhus University surveyed 1,599 Danish high-school students across 14 institutions to understand:

  • How many students talk to general-purpose AI chatbots the way they would talk to a friend
  • Why they do it
  • Whether loneliness or low social support predicts this behavior

The study used a mixed‑methods design, combining survey data with qualitative analysis of students’ open‑ended responses.

What Were the Results?

1. About 14.6% of students (234 teens) reported chatting with an AI chatbot “in the same way one would otherwise chat with a friend” in the past month.

Among the 14.6% of students who reported this use (December 2023 to March 2024 versions of general-purpose AI), the chatbots used were:

  • 69.2% ChatGPT
  • 40.2% Snapchat MyAI
  • 4.7% Character.ai
  • 1.3% Replika
  • 3.4% Other

2. Students tended to use AI chatbots in two distinct ways:

  • Utilitarian use — information, homework help, task support
  • Social-supportive use — emotional expression, coping with loneliness, venting

3. Compared to non‑users and utilitarian users, students who used chatbots for emotional support reported:

  • Higher loneliness (d = 0.53)
  • Lower perceived social support (d = –0.46)

These students were the most socially disconnected group in the sample.

4. Students were more likely to turn to AI chatbots when they felt:

  • Lonely
  • In a bad mood
  • A desire to self‑disclose

Interestingly, it was emotional distress, not feelings of friendship toward the AI chatbot, that predicted usage.


Study limitations:

  • The study examined general-purpose AI chatbots (ChatGPT, Snapchat MyAI, Character.ai, Replika, and others) in their 2023–24 versions. Because products and features evolve rapidly, the results may not apply to the general-purpose and mental-health-specific AI tools available today.
  • The sample was drawn from a single geographic area (Denmark), which may limit how well the findings generalize.
  • Data on chatbot use were self-reported, which may limit accuracy (reporting bias, errors in recall).
  • The study design was cross-sectional, so it cannot establish cause and effect.
  • Family environment, offline behaviors, and offline sources of support were not fully accounted for.
  • Further research is needed to better understand this issue.

What Does This Mean?

  • This study found that some lonely young people are using AI chatbots as a coping strategy, especially when they feel unsupported or disconnected from peers.
  • AI programs can feel safe, nonjudgmental, and always available — but they cannot replace the depth and reciprocity of human relationships.
  • For students who already feel isolated, relying heavily on chatbots may unintentionally deepen social withdrawal.
  • This doesn’t mean chatbots are “bad.” It means we need to pay attention to why and how we are using them, and to help identify more effective sources of connection.

Strategies for Healthy Digital Connection

    • Notice your patterns: Are you turning to chatbots mostly when you feel lonely or overwhelmed?
    • Reach out to trusted people: Even brief conversations with friends, family, or mentors can strengthen real-world connectedness.
    • Use chatbots intentionally: They can help you brainstorm, organize thoughts, or practice communication, identify trends, track and improve health behaviors — but shouldn’t be your emotional outlet or therapist.
    • Build offline routines: student organizations, sports, study groups, and hobbies create natural opportunities for connection.
    • Do not use chatbots for emergencies or crises.
    • Talk to a mental health professional: If loneliness feels persistent or overwhelming, a counselor can help you navigate it.

As with other digital tools, what matters is how and why AI is used.
By Ryan S Patel DO, FAPA
Psychiatrist and Director of College Psychiatry, The Ohio State University, Counseling and Consultation service,
Contact for speaking, training, comments: ryanpatel9966@gmail.com

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes. With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.


References

  1. Herbener, A. B., & Damholdt, M. F. (2025). Are lonely youngsters turning to chatbots for companionship? The relationship between chatbot usage and social connectedness in Danish high-school students. International Journal of Human–Computer Studies.
  2. Additional citations from the article are included in-text above.
  3. Patel R. Mental Health For College Students Chapter 8. Technology, media, and mental health.