AI usage and loneliness — March 22, 2026

For many people, AI chatbots are becoming more than tools — they’re becoming companions. A recent study published in the International Journal of Human–Computer Studies explored whether socially disconnected high‑school students are turning to chatbots for friend‑like conversations, and what that might mean for their well‑being.


What Was the Study?

Researchers from Aarhus University surveyed 1,599 Danish high‑school students across 14 institutions to understand:

  • How many students talk to general‑purpose AI chatbots the way they would talk to a friend
  • Why they do it
  • Whether loneliness or low social support predicts this behavior

The study used a mixed‑methods design, combining survey data with qualitative analysis of students’ open‑ended responses.

 


What Were the Results?

1. About 14.6% of students (234 teens) reported chatting with an AI chatbot “in the same way one would otherwise chat with a friend” in the past month.

Among the 14.6% of students using general‑purpose AI chatbots (December 2023 to March 2024 versions), the platforms reported were:

  • 69.2% ChatGPT
  • 40.2% Snapchat MyAI
  • 4.7% Character.ai
  • 1.3% Replika
  • 3.4% Other

(Percentages sum to more than 100% because students could report more than one platform.)

2. Students tended to use AI chatbots in two distinct ways:

  • Utilitarian use — information, homework help, task support
  • Social-supportive use — emotional expression, coping with loneliness, venting

3. Compared to non‑users and utilitarian users, students who used chatbots for emotional support reported:

  • Higher loneliness (d = 0.53)
  • Lower perceived social support (d = –0.46)

These students were the most socially disconnected group in the sample.
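The d values above are Cohen's d, the standardized difference between two group means; values near 0.5 are conventionally read as a "medium" effect. A minimal sketch of the calculation, using entirely hypothetical numbers (not the study's raw data):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical loneliness scores: emotional-support users vs. other students
d = cohens_d(mean1=3.1, sd1=0.9, n1=80, mean2=2.6, sd2=1.0, n2=1519)
print(round(d, 2))  # prints 0.5 -- a medium-sized difference
```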

4. Students were more likely to turn to AI chat when they:

  • Felt lonely
  • Were in a bad mood
  • Wanted to self‑disclose

Interestingly, usage was predicted by emotional distress, not by feelings of friendship toward the AI chatbot.


Study limitations:

  • This study examined general‑purpose AI chatbots (ChatGPT, Snapchat MyAI, Character.ai, Replika, and others) as they existed in 2023–24. Because products and features continue to evolve rapidly, the results may not generalize to current general‑purpose or mental‑health‑specific AI tools.
  • Modest sample size drawn from a single geographic area
  • Self‑reported data from active chatbot users, which may limit accuracy (reporting bias, errors in recall)
  • The study design was cross‑sectional, which cannot establish cause and effect
  • Family environment, offline behaviors, and offline support were not fully accounted for
  • Further research is needed to better understand this issue

What Does This Mean?

  • This study found that some lonely young people are using AI chatbots as a coping strategy, especially when they feel unsupported or disconnected from peers.
  • AI programs can feel safe, nonjudgmental, and always available — but they cannot replace the depth and reciprocity of human relationships.
  • For students who already feel isolated, relying heavily on chatbots may unintentionally deepen social withdrawal.
  • This doesn’t mean chatbots are “bad.” It means we need to pay attention to why and how we are using them, and to identify more effective sources of connection.

Strategies for Healthy Digital Connection

    • Notice your patterns: Are you turning to chatbots mostly when you feel lonely or overwhelmed?
    • Reach out to trusted people: Even brief conversations with friends, family, or mentors can strengthen real-world connectedness.
    • Use chatbots intentionally: They can help you brainstorm, organize thoughts, or practice communication, identify trends, track and improve health behaviors — but shouldn’t be your emotional outlet or therapist.
    • Build offline routines: student organizations, sports, study groups, and hobbies create natural opportunities for connection.
    • Do not use for emergencies or crises.
    • Talk to a mental health professional: If loneliness feels persistent or overwhelming, a counselor can help you navigate it.
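The "notice your patterns" suggestion above can be made concrete with a simple self‑tracking log. A minimal sketch in Python, using entirely hypothetical entries:

```python
from collections import Counter

# Hypothetical self-tracking log: (date, mood before opening the chatbot, reason)
chat_log = [
    ("2026-03-01", "lonely", "venting"),
    ("2026-03-02", "neutral", "homework help"),
    ("2026-03-04", "lonely", "venting"),
    ("2026-03-05", "lonely", "venting"),
    ("2026-03-07", "neutral", "brainstorming"),
]

# Tally the moods and reasons behind each chatbot session
moods = Counter(mood for _, mood, _ in chat_log)
reasons = Counter(reason for _, _, reason in chat_log)

print(moods.most_common())    # mood counts, most frequent first
print(reasons.most_common())  # reason counts, most frequent first
```

If most entries cluster around "lonely" and "venting" rather than task support, that is the pattern the study associates with lower social connectedness, and a cue to reach out to people instead.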

By Ryan S Patel DO, FAPA
Psychiatrist and Director of College Psychiatry, The Ohio State University, Counseling and Consultation service,
Contact for speaking, training, comments: ryanpatel9966@gmail.com

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes. With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.


References

  1. Herbener, A. B., & Damholdt, M. F. (2025). Are lonely youngsters turning to chatbots for companionship? The relationship between chatbot usage and social connectedness in Danish high-school students. International Journal of Human–Computer Studies.
  2. Patel, R. Mental Health For College Students, Chapter 8: Technology, media, and mental health.
Is depression linked to AI for personal use? — February 9, 2026

Generative artificial intelligence (AI) tools are powerful, and new products and features are becoming available at a rapid pace. We are increasingly using AI for work, school, and personal tasks.

A recent study in JAMA Network Open examined whether generative AI use is associated with depressive symptoms, one of the first large‑scale looks at this emerging issue (1).


What Was the Study? (1)

Researchers conducted a U.S. nationwide internet survey between April and May 2025, analyzing responses from adults across all 50 states (1).

  • 20,847 adults, ages 18 and older
  • Participants self‑reported:
    • Frequency of generative AI use
    • Use of social media
  • Depressive symptoms were measured using the PHQ‑9, a widely used clinical screening tool for depression
  • Data were analyzed in August 2025
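The PHQ‑9 yields a total score from 0 to 27, with standard published severity cutoffs (minimal 0–4, mild 5–9, moderate 10–14, moderately severe 15–19, severe 20–27). A minimal sketch of that scoring logic, for readers unfamiliar with the instrument; the study's "at least moderate depression" corresponds to a total of 10 or more:

```python
def phq9_severity(total_score):
    """Map a PHQ-9 total (0-27) to the standard severity bands."""
    if not 0 <= total_score <= 27:
        raise ValueError("PHQ-9 totals range from 0 to 27")
    if total_score <= 4:
        return "minimal"
    if total_score <= 9:
        return "mild"
    if total_score <= 14:
        return "moderate"
    if total_score <= 19:
        return "moderately severe"
    return "severe"

print(phq9_severity(12))  # prints "moderate"
```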

The goal was to understand whether frequency of AI use was associated with higher levels of negative affect, independent of other factors.


What Were the Results? (1)

Generative AI use was common but varied widely:

  • 10.3% of U.S. adults reported using generative AI daily
  • 5.3% reported using AI multiple times per day
  • Daily users most commonly reported:
    • Work‑related use (48%)
    • Personal use (87%)
    • Smaller proportions used AI for school

When mental health outcomes were examined:

  • Daily or more frequent AI use was associated with higher depressive symptom scores; in this sample, it was mainly for personal use (not school or work)
  • Adults who used AI daily had approximately 30% greater odds of at least moderate depression
  • The association was strongest among younger adults, compared with older age groups
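A note on interpreting the "30% greater odds" figure: odds are not the same as probability. A small sketch, using a hypothetical baseline rate (an assumption for illustration, not a figure from the study), shows how an odds ratio of 1.30 translates into absolute risk:

```python
def prob_to_odds(p):
    """Convert a probability to odds: p / (1 - p)."""
    return p / (1 - p)

def odds_to_prob(odds):
    """Convert odds back to a probability: odds / (1 + odds)."""
    return odds / (1 + odds)

# Hypothetical baseline: suppose 20% of less-frequent users screen at
# moderate depression or above (an assumed rate, not from the study)
baseline_p = 0.20
baseline_odds = prob_to_odds(baseline_p)  # 0.25
elevated_odds = baseline_odds * 1.30      # 30% greater odds
elevated_p = odds_to_prob(elevated_odds)

print(round(elevated_p, 3))  # prints 0.245 -- about 24.5%, not 26%
```

In other words, a 30% increase in odds implies a somewhat smaller increase in the underlying probability.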

 

What are some caveats?

  • This was a cross-sectional study, which shows a snapshot but cannot establish cause and effect.
  • As the authors note, the results are consistent with personal AI use causing greater depressive symptoms, but equally consistent with greater depressive symptoms precipitating greater AI use, or with neither.
  • The study did not account for other confounding factors, such as preexisting psychiatric diagnoses.

What Does This Mean?

This study does not suggest that generative AI is inherently harmful. Instead, it raises important questions about how, why, and by whom these tools are being used.

Possible explanations for the observed association include:

  • People experiencing depression may be more likely to turn to AI tools
  • Heavy AI use could displace social interaction, sleep, or restorative activities
  • AI use may reflect broader patterns of screen time, isolation, or stress
  • Future research is needed to clarify mechanisms, directionality, and individual differences in how AI use relates to mental health (1)

What Does This Mean for Everyday Life (AI and mental health safety guidance)?

As with other digital tools, it matters how and why AI is used. Some practical considerations include:

 


By Ryan S Patel DO, FAPA
OSU‑CCS Psychiatrist

If you would like to be notified of a new post (usually once per month), please subscribe; it’s free.

For speaking engagements, keynotes, seminars, etc contact: ryanpatel9966@outlook.com

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes. With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.


Reference

  1. Perlis RH, Gunning FM, Usla A, et al. Generative AI Use and Depressive Symptoms Among US Adults. JAMA Network Open. 2026;9(1):e2554820. doi:10.1001/jamanetworkopen.2025.54820
  2. Patel R. Mental Health For College Students Chapter 8. Technology, media, and mental health.
Using Google Gemini AI for mental health support: benefit vs limitations — November 30, 2025

Students sometimes turn to AI for mental health support, but AI is NOT a replacement for professional treatment, NOT intended for emergencies, and NOT therapy. Using AI for mental health is not without risks (noted below). A recent article discusses considerations for using Google Gemini AI for mental health (1).

Combining traditional wellness practices with digital tools like Google Gemini can provide more accessible, personalized support.

Here are some examples (1):

Step 1. Assessment & Baseline: Reflect on current emotional wellbeing habits, stress levels, sleep, etc. Gemini can help you create baseline surveys and interpret wearable data.

Step 2. Set Goals: Set specific, measurable, realistic mental health goals (e.g., reduce anxiety, improve sleep). Gemini can suggest goal‑setting frameworks and help you refine goals.

Step 3. Plan Interventions: Choose the practices below that suit you. Gemini can help pick appropriate interventions and schedule reminders.

Step 4. Tools & Resources: Gather apps, guided meditations, and wearable trackers. Gemini can help you identify additional resources that may be helpful.

Step 5. Monitor & Iterate: Track your progress; note what works and what doesn’t. Gemini can analyze your logs and suggest adjustments.

Step 6. Support Network & Professional Help: Use community, professional therapy, and peer support when needed. Gemini can help you locate local professionals and support groups, and create checklists for sessions. For mental health support options at OSU, go here: https://ccs.osu.edu/services/mental-health-resources

Using Google Gemini to Enhance 10 Evidence‑Based Practices That Support Mental Health (1)

1 Mindfulness and Meditation such as seated meditation, body scan, mindful breathing.

  • Benefits: Reduced stress and anxiety, improved emotional regulation, increased attention span.
  • AI / Google Gemini’s role: Can generate personalized guided meditations; suggest mindfulness prompts; help analyze meditation logs; recommend apps, practices based on user’s mood.

2 Physical activity such as Aerobic exercise, strength training, yoga, etc.

  • Benefits: Releases endorphins; improves mood; reduces symptoms of depression and anxiety.
  • How to do it: Regular routine (e.g., 30 mins, 3‑5 times/week); choose types you enjoy.
  • AI / Google Gemini’s role: Reminders, custom workout plans; tracking progress; motivating messages; adapting plan based on feedback.

3 Adequate Sleep Hygiene such as consistent schedule, avoiding caffeine at night, limiting screen time before bed.

  • Benefits: Better mood, improved cognitive function, reduced risk of mental health disorders.
  • How to do it: Set regular wake/sleep times, create sleep‑friendly bedroom, avoid blue light at night.
  • AI / Google Gemini’s role: Suggest improvements; analyze sleep trackers; recommend sleep routines; issue alerts when patterns deteriorate.

4 Balanced Nutrition with whole foods: vegetables, fruits, lean proteins, healthy fats; reducing processed foods.

  • Benefits: Affects brain health (neurotransmitters); energy stability; mood stabilization.
  • How to do it: Meal planning; nutrition; hydration.
  • AI / Google Gemini’s role: Suggest recipes; smooth meal planning; help adjust nutrition to lifestyle; track nutritional deficiencies.

5 Social Connection such as Maintaining friendships, community engagement, supportive relationships.

  • Benefits: Lower rates of depression and anxiety; buffer stress; improve wellbeing.
  • How to do it: Regular catch‑ups; joining interest groups; volunteering; quality time.
  • AI / Google Gemini’s role: Reminders to reach out; suggest local groups; help draft messages; coach on communication skills.

6 Cognitive Behavioral Techniques such as reframing negative thoughts, behavioral activation.

  • Benefits: Strong evidence in reducing depression, anxiety; improving resilience.
  • How to do it: Identify negative thinking patterns; set small behavioral goals.
  • AI / Google Gemini’s role: Guide journaling; challenge unhelpful thoughts; suggest techniques.

7 Journaling and Reflective Practices such as Writing down thoughts, gratitude journaling, reflection on daily experience.

  • Benefits: Helps process emotions; increases self‑awareness; reduces rumination.
  • AI / Google Gemini’s role: Provide prompts; analyze themes over time; offer feedback; suggest reflection questions.

8 Limiting Screen Time & Digital Detox, especially social media or negative content; periodic breaks.

  • Benefits: Improves sleep, reduces anxiety, improves concentration.
  • How to do it: Set screen‑free hours; remove apps; use blue‑light filters; substitute with offline activities.
  • AI / Google Gemini’s role: Monitor usage; suggest schedule for detox; send reminders; provide alternative offline ideas.

9 Nature Exposure such as Time outdoors, green spaces, forests, parks, natural light.

  • Benefits: Reduces stress; improves mood; improves attention; sometimes lowers blood pressure.
  • How to do it: Daily walks; gardening; sitting outside; weekend hikes.
  • AI / Google Gemini’s role: Suggest nearby parks; remind to get outside; provide information on nature therapy; support tracking nature exposure.

10 Professional Support & Therapy: Talking to mental health professionals (therapists, psychologists, psychiatrists), possibly medication if needed.

  • Benefits: Tailored treatment; long‑term improvement; skills development.
  • How to do it: Seek licensed professional; assess online therapy options; ensure credentials; set expectations.
  • AI / Google Gemini’s role: Provide information on finding providers; clarify what therapy entails; prepare questions; help understand treatment options; supplement (not replace) professional help. https://ccs.osu.edu/services/mental-health-resources

Potential Risks and Ethical Considerations (1)

AI‑assisted mental wellness has promise, but also comes with risks, so being aware can help in using such tools safely.

  • Accuracy & Hallucinations: As studies show, models including Gemini may sometimes produce incorrect or misleading outputs. For medical or mental health matters, this can be harmful.
  • Privacy & Data Security: Mental health data, sensor data, journal entries are highly sensitive. Ensuring secure storage, consent, transparency in use is crucial. Understand terms and conditions and avoid entering private, confidential, individually identifiable information whenever possible.
  • Overreliance on AI / Self‑Diagnosis: Tools should support, not replace, professional help. Self‑managing with AI alone might delay getting necessary care.
  • Bias and Culture: Mental health concepts and practices are culture‑sensitive. What works in one region might not be valid in another. AI trained on biased datasets may misinterpret nonwestern expressions of distress.
  • Ethical / Regulatory Compliance: Data protection laws (e.g., HIPAA, GDPR), professional guidelines, and licensing requirements for digital health tools must be respected. Take time to familiarize yourself with the features and data limitations of the AI you are using.
  • Limitations: Limitations include potential inaccuracies, lack of emotional nuance, data privacy concerns, and an inability to provide licensed therapeutic interventions.

Additional resources: For mental health support options at OSU, go here: https://ccs.osu.edu/services/mental-health-resources

By Ryan S Patel DO, FAPA
OSU-CCS Psychiatrist
Contact: ryanpatel9966@outlook.com

Disclaimer: This article is intended to be informative only. It is advised that you check with your own physician/mental health provider before implementing any changes.  With this article, the author is not rendering medical advice, nor diagnosing, prescribing, or treating any condition, or injury; and therefore claims no responsibility to any person or entity for any liability, loss, or injury caused directly or indirectly as a result of the use, application, or interpretation of the material presented.

References:

  1. https://www.quickobook.com/healthfeed/view/how-google-gemini-is-transforming-mental-wellness-10-proven-ways-to-improve-your-mind-and-mood