The Authentic Algorithm: Can AI Enhance or Hinder Who You Really Are?

Welcome to The Original Self Podcast. I’m Evet DeCota, owner of DeCota Life Coaching. I’m a psychology-informed life coach exploring resilience, mindset, and the courage to become your authentic self. This is a space for honest conversations about growth, identity, relationships, and all the messy moments in between that shape who we become. Whatever brought you here today, you’re in the right place.

I’ve noticed over the years, standing behind a salon chair and coaching individuals, that people rarely come in for just a hair service or a single topic. They come to me carrying doubts, fears, dreams, and questions about who they are becoming.

Over time, I’ve realized that beneath all the noise of expectations, criticism, and life’s pressures, there is a version of ourselves that still exists. Let’s explore The Authentic Algorithm: Can AI enhance or hinder who you really are?

Over the last month or so, I have been speaking with clients and listening to podcasts and news stories about how AI is taking over just about everything, from advertising and marketing to news, songs, and audiobooks, and to the obvious AI-generated content flooding social media reels and stories. It’s fun to watch reels of dogs dancing in sync or barking out a sentence, and no one is fooled by them. But watching celebrities, athletes, or politicians say something they never said, voices and mouths perfectly matched, is where I start to feel unsettled. People fall for these falsities, and as misinformation has shown us over the last decade, when we start to believe a false narrative, a small divide becomes a chasm between people, viewpoints, and feelings.

I have a heightened concern about businesses increasingly pressured by wealth and power to abandon any guardrails that companies may want to place on the use of AI. I worry about American jobs becoming extinct in a mere decade or less. What would we do for money? What would we do with our time? How will we learn new things that make our minds and bodies stronger? How will we connect, whether empathetically or in real life? Will we all become like the characters in the Apple TV series Pluribus, where humanity is merged into a single collective consciousness, peaceful on the surface, but stripped of every individual thought and impulse that makes us who we are?

And then a thought turned me inward. Hadn’t I used AI for prompts, spelling, and grammar checks? Didn’t I rely heavily on AI to build the first rendition of my website because the whole field was new to me, and I felt very unsure about my ability to communicate who I was and what I wanted to present to the world? I wondered whether using AI would quietly erode everyone’s creativity and critical thinking abilities, and whether leaning on it the way some people lean on an emotional support dog for comfort, rather than capability, might do more harm than good.

In case you have been living under a rock for the last few years, AI, at its core, is a technology trained on an enormous amount of human-generated information, designed to learn from it, reason with it, and communicate in ways that can feel surprisingly human. It doesn’t feel, intuit, or originate. That distinction matters enormously when we are talking about authenticity.

The first time I used AI, it seemed similar to a Google search, but now it feels like it’s a totally different beast. The more it starts to reason and communicate, the more my psychology-informed, curious brain hears the faint beep of a dying smoke alarm.

I’ve had many conversations with friends, family, and clients about how they use AI in their personal lives. A few have told me that they use AI as a makeshift therapist in times of need, and believe that it is better than actually speaking to a licensed professional therapist. As a coach, that kind of thinking made me do a deep dive to find out how empathetic AI can truly be. I found a study by researcher Victor Frimpong called Empathy and the Human-Moment Gaps of AI Chatbots: Insights from Empathy Displacement Theory.

The study speaks to how psychologists describe empathy as having three dimensions. The first, affective empathy, is the ability to emotionally connect with what another person is feeling, including unconscious automatic mimicry. The second, cognitive empathy, is the capacity to understand someone else’s perspective. And the third, compassionate empathy, moves us to actually do something about another person’s suffering rather than simply observe it.

The thing about empathy, though, is that it develops through trust, timing, tone, and the kind of presence that tells another person they are truly being seen, which is exactly why it’s so difficult for technology to replicate. Empathy assumes that both people in the exchange have actually felt something.

Frimpong identifies three ways the absence of genuine AI empathy does not just leave a gap, but actively changes how humans experience empathy over time. The first is Affective Surfaceism, in which people begin to prefer the predictable comfort of a chatbot over the messier reality of human connection. The second is Memory Fragmentation, in which the lack of any relational history distorts how we value empathy in our human relationships. The third is Moral Framing Mismatch, in which organizations begin to prioritize efficiency over genuine care.

Together, these three gaps form the foundation of what Frimpong calls Empathy Displacement Theory. AI-simulated empathy doesn’t just fill the space where real empathy used to be. It gradually retrains us to accept the imitation as the real thing, until the most pressing question is no longer whether AI can care about us, but what happens to us when we become used to the version of care it offers.

So even though many feel truly heard, supported, and seen by the AI therapist, it’s not programmed, YET, to know how the person feels through their own similar experiences, and it can’t mimic facial expressions, gestures, or tone to allow the human on the other side of the screen to feel really connected.

Reflection Question:

Take a moment to consider this: Think about a time you felt genuinely heard by another person. Not just agreed with, but truly heard. Could that moment have happened with a chatbot? If your answer is no, what does that tell you about what you actually need from the people in your life?

On my quest to find out if AI enhances or hinders our real selves, I reached out to two people who live and breathe the world of AI. The first works for a major AI company whose mission centers on safety and honesty. The second is my brother Stefan, a strategic business advisor with 25 years of experience inside large startups and vast data companies, AI included in all of it.

My contact at the major AI company and I have talked many times about this, specifically around the eventual programming and processing of emotions by chatbots, which is his main job. I know, frightening images straight out of the film I, Robot come to mind, of thousands of enraged anarchist robots standing on shipping containers, ready to strike while plotting the ultimate takeover of the human population.

He told me that AI will get closer to replicating human emotion, but it will never be a one-to-one match, and oversimplifying that distinction is part of the problem. If some form of independent intelligence does emerge, it will have developed from an entirely different set of parameters than humans, with no microexpressions, no physiological cues, and no instinctive sensitivity to the subtle signals we read in one another constantly. He said that AI may actually surpass us in objective decision-making because of how efficiently it processes information, but it can’t gather emotional data the way a human does. As he put it, “I might say one thing, and my body language will tell a therapist something completely different. There is no way for AI to know that piece.”

Stefan sees AI as a force that gradually erodes self-creativity and breeds reliance, making people increasingly dependent on it across nearly every area of life. He also points to mounting evidence of how quickly it is eliminating jobs, and while he believes a bounce back will eventually come for those who learn to master it creatively, he warns that AI will largely wipe out what remains of the middle class and drive poverty rates significantly higher over the next one to three years simply by making entire categories of human work obsolete.

Reflection Question:

Listener, consider this: What is the one thing you do, either in your work or your personal life, that you believe only a human being could do? And how certain are you that it will still be true in five years?

Those are some of the ways AI can work against who we are. But can it also enhance our original selves?

A client recently told me she was struggling with her family and had turned to a chatbot for support. After several conversations, I warned her that AI is the ultimate people-pleaser and may not challenge your thoughts and beliefs the way a therapist or coach would. But what she described was something different. She was using it to journal, to identify her own thinking patterns and ruminations, and to get to the crux of her emotional distress so she could frame it clearly, both to herself and to a professional. She used it as a sounding board for deeper self-reflection and to clarify what her core values actually are. It even suggested a psychological concept that she completely identified with, which led her to a course that she is now working through. The way she uses it offers her a glimpse of her original self, and that, in my book, is a win.

Personally, I use it to get unstuck, whether that means catching a word I dropped because I type too slowly, untangling a jumble of words that has become a complete run-on sentence and could use a little help from the concise tab, or amplifying my message without diluting what I meant when I wrote it the first time. In coaching, or heck, in all communication, the clearer I project, the quicker and deeper someone can reflect on the subject.

Reflection Question:

So, I ask you: Think about how you currently use AI, or how you might use it. Are you bringing your own thoughts and questions to it and letting it help you go deeper? Or are you handing it a blank page and asking it to fill it in for you? The answer to the last question matters more than you might think.

So where does that leave us?

AI is not going anywhere. It will get more sophisticated, more convincing, and more woven into the fabric of daily life than most of us are prepared for. The question is not whether we will use it, but whether we will use it with intention rather than abdication, keeping ourselves in the equation.

What I have come to believe, through my own experience, my research, and the people I have talked to while preparing this episode, is that AI becomes a problem for the Original Self the moment we ask it to think for us instead of with us. The moment we hand it our voice and call the result our own. The moment we turn to it for the kind of comfort and connection that can only be built between two people with actual lived experience.

But when we bring our own ideas, our own questions, our own half-formed thoughts, and let AI help us shape them into something clear, that is a tool in service of the original self and not a replacement for it. The ladder, not the climber.

My brother Stefan, along with many scientists and economists, sees the devastating financial consequences coming; they are real, and they are very serious. My contact inside the AI industry sees the empathy gap widening, along with the very real possibility that AI will eventually learn emotions on its own terms, and that deserves our utmost attention.

So notice it now. Use the tool, but stay in the room with yourself while you do. Remember, AI can hold the ladder, but you have to do the climbing. The tool doesn’t make the work yours; your intention, your voice, and your willingness to show up and think do.

Reflection Question:

Final reflection question to contemplate: When you use AI, are you bringing yourself to it, or are you slowly letting it replace you?

Thank you for listening to The Original Self Podcast. If these reflections resonate with you and you’d like to explore your own growth, you can learn more about my coaching at decotalifecoaching.com.

Source:

Frimpong, V. (2025). Empathy and the Human-Moment Gaps of AI Chatbots: Insights from Empathy Displacement Theory. BRAIN. Broad Research in Artificial Intelligence and Neuroscience. https://www.edusoft.ro/brain/index.php/brain/article/download/1934/2417
