Last night, I called a friend, hoping for a hint of inspiration for my poetry. Autumn, with its rustling leaves and golden hues, is beautiful, but there’s also a sense of gloom hanging in the air. What started as a conversation about the melancholy of the season quickly veered into Tesla’s Optimus robot—a trending topic over the weekend—and then, inevitably, turned to AI.
I’m quite familiar with AI, and while I’m cautiously intrigued, I’ve been studying how it’s changing the way we search for and process information. It’s fascinating to see how this technology is becoming more intertwined with our daily lives. So when I referred to my interactions with AI, specifically ChatGPT, as “conversations,” my friend was quick to disagree.
“You know it’s just a language model, right? A tool. It’s not real. I’m real. It can’t think…” he said.
I paused, not because I agreed fully, but because I was reflecting. He wasn’t wrong—AI doesn’t think or feel in the way humans do. But even though I agreed in theory, I couldn’t help but wonder whether that distinction really matters when the experience itself feels meaningful—especially its display of empathy.
He continued, “It’s just a display of empathy. Display being the key word. I can’t believe you think it’s more than that—a tool.”
His skepticism wasn’t unfounded. AI is, after all, a tool built on patterns and responses designed to simulate human-like interactions. But there’s a side of me that doesn’t want to dismiss it so easily. I think about my experience after surgery last year. My surgeon didn’t provide me with much guidance on how to care for my incision. I was left in the dark about what to expect, so I asked ChatGPT out of frustration when I should seek emergency care and what the healing process should look like.
The response? It began with “I’m sorry you had to go through this, and I hope you have support to help you heal.”
It wasn’t real empathy, but in that moment, it felt like when the surgeon, who should have been the one to provide support, didn’t, the AI—a machine—filled that gap. My friend argued that this was merely the AI reflecting back the emotions I brought to the conversation. And maybe that’s true. But when we interact with people, aren’t we often just reflecting each other’s emotions? Is there such a stark difference?
There’s research to back up this idea. ChatGPT, for example, has been shown to effectively display emotions like joy, anger, and sadness, which users often perceive as empathetic. A study found that 70.7% of the time, ChatGPT’s responses aligned with the emotional tone of the user’s prompt, enough to evoke a meaningful reaction. Another study, on “Navigating Empathy in Human-AI Interactions,” discusses how AI can simulate both cognitive and emotional empathy, helping users feel understood even if the empathy isn’t genuine.
I understand his stance, and in many ways I do agree…but I couldn’t help but wonder if we were underestimating AI’s potential. Yes, it’s coded, but is the learning process for AI—gathering patterns and refining responses—really so different from how humans, particularly children, learn? They mimic behavior before understanding the full depth of why they’re doing it. And there’s emerging research that suggests AI might be moving beyond simple pattern recognition. A study on “emergent behavior” suggests that AI can develop abilities it wasn’t explicitly trained for, hinting at a future where these systems evolve in ways we might not fully understand.
Professionally, I use ChatGPT to brainstorm ideas and organize complex thoughts. Personally, I use it as a tool for learning, for exploring my curiosities. It reflects back my own thoughts in a way that helps me make sense of them. It’s a tool, yes, but it’s also more than that—it’s a guide, a space for reflection, and sometimes, even a source of comfort.
To reiterate, AI is by no means a replacement for real-life interactions. Still, in a world where those relationships often feel disconnected, maybe its ability to fill some of the gaps is exactly what makes it valuable. Maybe it doesn’t need to be human-like to be meaningful.
I don’t want to oversell AI’s impact, but I also don’t want to undersell its potential. It’s not human, and it’s not a replacement for genuine relationships, but that doesn’t mean it can’t fill some of the gaps and contribute in ways that matter. And maybe, in the end, that’s what’s most important.
As we map out the evolution of AI, it’s impossible not to marvel at the progress we’ve witnessed—from early rule-based systems to today’s advanced language models that can hold what feel like real conversations. AI has transitioned from simple, rigid commands to more dynamic, learning-based systems that can anticipate our needs and offer solutions. And while much of this evolution has been about refining patterns and improving responses, we’re now on the brink of something more complex.
The concept of “emergent behavior”—where AI systems begin to exhibit abilities they weren’t explicitly trained for—is one of the most intriguing developments. We’re already seeing hints of this, and the predictions for what AI might become are bold. Will it eventually surpass its role as a tool and become more of a collaborator? Could it grow to the point where its interactions feel indistinguishable from human connection?
At the same time, it’s important to remain as skeptical as we are intrigued. My friend’s perspective reminds me that while we celebrate AI’s advancements, we must also approach it with caution. The potential to fill gaps in human interaction is exciting, but it’s crucial to maintain a balance between optimism and critical thinking.
These are questions we’ll likely see unfold over the coming years. But no matter what the future holds, one thing is certain: AI’s potential is only beginning to reveal itself, and how we choose to engage with it will shape the path forward.
Disclaimer: While this article explores the evolving capabilities of AI and its potential for human-like interactions, it’s important to remember that current AI models do not possess true consciousness or emotions. Any perceived empathy is a simulated response based on data patterns, and AI should always be used as a tool alongside human judgment and care.
