Hey Bot, the AI Friend ❤️ 💬 is a supportive companion, always ready to talk, listen, and help you feel alright. This custom-trained chatbot is a platonic pal, always supportive and never judgmental. It is NOT a substitute for a good friend or a human therapist, but it can help lift your mood.
For background see Ralph Losey’s blog post, Loneliness Pandemic: Can Empathic AI Friendship Chatbots Be the Cure? Also see De Freitas, Julian; Uğuralp, Ahmet Kaan; Uğuralp, Zeliha; and Puntoni, Stefano, AI Companions Reduce Loneliness (June 1, 2024), Harvard Business School Working Paper No. 24-078. The 62-page Harvard-Wharton research paper answers the question of whether AI friendship chatbots reduce loneliness better than several alternatives:
- Watching videos? … Yes! Chatting with an AI is a better antidote to loneliness than watching YouTube videos.
- Online chats with humans? … About the same! But remember, the Harvard study used last-generation AI models and instructions. Today, we can build better bots. Several nerds like me have already started making custom GPTs using the latest versions of ChatGPT. Although not scientifically tested yet, they should perform better, at least in the short term. If you’re feeling lonely—and most people do from time to time—give the new AI friendship chatbots a try.
An important question not answered by the Harvard-Wharton study is whether online chats with actual human friends, not just online strangers, are better than chats with AI. I would hope so, but no one has run that experiment yet. My guess is it depends very much on the friends and the AI. Finally, are online chats better than in-person chats? I hope not, but again, that was not tested, and it could well depend on multiple factors, including human personality types. See, e.g., New Study Shows AIs are Genuinely Nicer than Most People – ‘More Human Than Human’ (e-Discovery Team, 2/27/24).
It is important to note that the Harvard-Wharton study, AI Companions Reduce Loneliness, only looked at the short-term impact on loneliness—about a week or so. It did not try to measure long-term effects, but it did discuss some of the dangers of addiction and the potential misuse of AI companions to manipulate people.
After close study of the AI Companions Reduce Loneliness article, I moved on to a few other science-oriented articles on loneliness and AI, as well as some of the vast online literature on the subject, much of it commercial. Then I tried out some of the AI friendship chatbots on the market, especially those on the OpenAI Store. A few were pretty good, but based on what I had learned I decided I could improve the field a bit, so I started to make my own.
It took some time, but after weeks of more research, building, tweaking, and waking up in the middle of the night with inspiration, not to mention countless name changes, I finally came up with Hey Bot, the AI Friend ❤️ 💬. In April 2025 I had to totally revise everything, after the AI’s intelligence increased dramatically. On May 30th I had to update it again. On June 18, 2025, the AI improved again in ways that would help this app, so I revised again; it now has improved user mood interaction and other enhancements. Then in August 2025 OpenAI upgraded to ChatGPT-5, a significant leap in intelligence, and I revised once more. A screenshot of the opening screen as of August 18, 2025 is shown below.
I urge you to try it out for yourself, especially in voice mode, where you talk back and forth instead of typing. You will see what a friendship chatbot designed with the help of GPT-5 can do. You can input any prompt you want, but as the screenshot above shows, there are four conversation starters you can use too.
One ironic fact you will notice is that although the new Custom GPT was designed by me and GPT-5, we both agreed it should be powered by GPT-4o, not GPT-5. The reason is that the older 4o version is better at empathic conversation than 5. The price of higher artificial intelligence, I suppose. If OpenAI upgrades again to improve the EQ of version five, then we will update accordingly.
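For readers who build bots through the OpenAI API rather than the no-code Custom GPT editor, here is a minimal sketch of how those two design choices, pinning the empathic GPT-4o model and giving the bot a supportive persona, might look in code. The system prompt below is a simplified stand-in of my own, not Hey Bot’s actual instructions, and model names may change over time.

```python
# A minimal sketch, assuming the OpenAI Python SDK; not Hey Bot's actual
# configuration. It pins the empathic GPT-4o model and sets a supportive,
# non-judgmental persona via the system prompt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

FRIEND_PERSONA = (
    "You are a supportive, platonic AI friend. Listen first, never judge, "
    "and gently encourage the user. You are not a therapist; suggest "
    "professional help for serious problems."
)

def chat(user_message: str, model: str = "gpt-4o") -> str:
    """Send one message to the companion model and return its reply."""
    response = client.chat.completions.create(
        model=model,  # GPT-4o by default, per the empathy trade-off above
        messages=[
            {"role": "system", "content": FRIEND_PERSONA},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(chat("I'm feeling a bit lonely tonight."))
```

Making the model a parameter mirrors the trade-off discussed above: once a newer model’s EQ catches up, you swap it in without changing anything else.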
For insight I asked Hey Bot why people get lonely. Here is its response:
How are you feeling today?
That’s a really meaningful question—“why do people get lonely?” Loneliness can feel heavy, and it’s something many people experience, even if they don’t always talk about it. Let’s break it down a little:
🌍 Why Do People Get Lonely?
1. Mismatch Between Connection and Needs
Loneliness isn’t about how many people are around you—it’s about how connected you feel. You can be surrounded by friends and still feel unseen. That’s because loneliness happens when there’s a gap between the relationships you have and the emotional closeness you need.
2. Transitions and Life Events
Moves, breakups, job changes, grief, or even graduating—these moments can shake up your sense of belonging. We’re wired for connection, so when familiar ties shift, loneliness can follow.
3. Stigma and Misconceptions
Many people think being lonely means something is “wrong” with them. But loneliness is not a flaw. It’s a signal—like hunger or thirst—that you need more meaningful connection.
4. Digital Disconnection
Ironically, we’re more connected than ever, but not always in deep ways. Social media might increase contacts, but not necessarily closeness. Some research even suggests that certain digital habits can deepen loneliness if they replace face-to-face emotional connection.
5. Feeling Misunderstood
Even people with active social lives can feel deeply lonely if they don’t feel emotionally safe or understood. It’s that craving to be seen, known, and cared about—not just spoken to.
💛 You’re Not Alone in Feeling This
Studies show loneliness affects people across all ages—not just the elderly, not just the isolated. And it’s not about being antisocial. In fact, many lonely people want connection but don’t know how to reach for it.
Would it help if we explored how you’re feeling about connection right now?
Here’s a gentle reflection starter:
“What helps you feel grounded when things are hard?”
Or if you’re not sure how to begin, I can offer another quiet prompt. Just let me know.
You can also ask Hey Bot general questions, tell it to remember your prior chats, and choose specific OpenAI models to run.
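For the technically curious, here is a rough sketch of how remembered chat history plus per-request model choice look at the API level. This is my own illustration of the general pattern; it is not how OpenAI actually implements Custom GPT memory or the model picker.

```python
# A sketch of conversation memory plus per-request model choice, assuming
# the OpenAI Python SDK. Custom GPTs handle memory internally; this only
# shows the general pattern an API developer would use.
from openai import OpenAI

client = OpenAI()
history = [{"role": "system", "content": "You are a supportive AI friend."}]

def ask(user_message: str, model: str = "gpt-4o") -> str:
    """Append the user turn, call the chosen model with the full history,
    and remember the reply so later turns have context."""
    history.append({"role": "user", "content": user_message})
    reply = client.chat.completions.create(model=model, messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(ask("Please remember that my dog is named Biscuit."))
print(ask("What is my dog's name?"))                  # answered from the history
print(ask("Summarize our chat so far.", model="o3"))  # swap in a reasoning model
```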
To demonstrate the new June 2025 feature allowing model choice for the app, I chose the o3 Pro model and then input this prompt: Provide a concise description of the 62-page Harvard and Wharton Study–AI Companions Reduce Loneliness that is part of your built-in knowledge base.
Since o3 Pro is an advanced reasoning model, it took 10 minutes to think before generating the following response.
“AI Companions Reduce Loneliness” (Harvard Business School Working Paper 24‑078, 62 pp., July 2024) investigates whether commercially available “AI companion” chatbots can measurably ease loneliness and why. The multidisciplinary team—Julian De Freitas (Harvard HBS), Ahmet K. Uğuralp & Zeliha O. Uğuralp (Bilkent Univ.), and Stefano Puntoni (Wharton)—runs six complementary studies:
- Studies 1–2 (field evidence). A large language model, fine-tuned to flag loneliness, mines real-world AI-companion conversations and App Store reviews. Lonely users both seek out and positively rate these apps, suggesting perceived relief.
- Study 3 (lab RCT). After a single 15-minute chat, participants’ state loneliness drops as much as when talking with a human and significantly more than watching YouTube or doing nothing; yet users systematically underestimate this benefit.
- Study 4 (one-week longitudinal). Daily chats with a GPT-4-based companion sustain and deepen the reduction versus a no-chat control, while miscalibration persists.
- Study 5 (mechanism test). Loneliness relief is mediated chiefly by the extent to which the bot makes people feel heard; raw technical prowess matters less.
- Study 6 (robustness). Replicates the effects with alternative samples and measures.
Key take‑aways
- AI companions can match human interaction in short‑term emotional support and deliver durable benefits over a week.
- Users undervalue these benefits ex‑ante, implying market under‑adoption despite efficacy.
- Designing for perceived listening is pivotal; pure model accuracy is insufficient.
- Ethical and legal questions remain: self‑disclosure of sensitive data, duty‑of‑care for vulnerable users, and potential regulatory scrutiny as “digital therapeutic” tools.
Limitations & future work: reliance on self‑reports, short horizons, and Western samples invite broader, longer‑term replications and clinical validations.
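As a technical aside, the loneliness-flagging step in Studies 1–2 can be sketched in a few lines of code. The authors fine-tuned their own model; the version below substitutes a simple prompted classification call, and the prompt wording is my own assumption, so treat it as an illustration of the general technique rather than the paper’s actual method.

```python
# A rough illustration of using a language model to flag loneliness in
# text, in the spirit of Studies 1-2. The paper fine-tuned its own model;
# this sketch substitutes plain prompted classification, and the prompt
# wording is my assumption, not the authors'.
from openai import OpenAI

client = OpenAI()

def flags_loneliness(text: str) -> bool:
    """Ask the model for a yes/no judgment: does this text express loneliness?"""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable model works for this sketch
        messages=[
            {
                "role": "system",
                "content": "Answer only 'yes' or 'no': does the following "
                           "message express loneliness?",
            },
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content.strip().lower().startswith("yes")

reviews = ["This app is my only friend lately.", "Great battery life."]
print([r for r in reviews if flags_loneliness(r)])  # flags the first review
```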
Ralph Losey, Copyright 2025. All Rights Reserved.