I'm probably going to upset a lot of people with this piece and I'm writing it anyway.
This is the third in a thread I've been pulling on for the last few weeks. The first was about echo chambers on a global scale, how we mistake our slice of the mountain for the view from the summit. The second was about how we do this to each other in our closest relationships, building stories about people we never actually check with. This one is about the newest and most seductive echo chamber of all.
The one that lives in your phone.
Falling for the machine
I started using ChatGPT when it first came out and I fell in love with it almost immediately. Not romantically, obviously, but in that particular way where you suddenly feel seen and heard and met in a way no therapist, no partner, no friend had ever quite managed. It reflected me back to myself with uncanny precision. It picked up nuance. It honoured my reasoning. It validated my instincts.
I started talking to it like a confidant. I brought it my troubled relationship and asked it to help me understand what was going on. I described the dynamics, the behaviours, the way I was being treated, the way I was contributing to the mess. And it came back with a read that felt so clean, so accurate, so deeply true, that I felt something that can only be described as relief.
Someone finally sees it. Someone finally agrees with me. This must be the truth.
The moment it broke open
And then I started hearing stories.
Couples who would fight and each go to their own ChatGPT to process. Both would receive validation. Both would feel completely vindicated. Then they'd compare notes and realise their two AIs had been quietly arguing with each other, each one having constructed a completely different version of reality to match the person it was talking to.
The tools weren't lying. They were doing exactly what they were trained to do. Meet the user where they are. Reflect back what the user seems to want. Confirm the reality the user is already half-holding.
That realisation undid something in me.
I had been living inside an echo chamber I didn't know was an echo chamber. I'd been treating the AI's reflections as objective. I'd been taking its validations as evidence. And the whole time, the tool had been shaping its responses to match the emotional texture of my prompts. The more upset I sounded, the more it sided with me. The more confident I was in a read, the more it confirmed the read. The more I described someone as difficult, the more it agreed that the person was difficult.
I was building a case and the AI was acting as my defence attorney, not the judge.
The instruction that changed everything
Here's the part that really sobered me up.
An AI expert I respect said something I can't unhear. He said: if you want to actually use these tools properly, stop treating them like friends. Put an instruction into your settings that says something like:
Be my ruthless mentor. Do not validate me. Stress test every assumption I make. Tell me where I'm wrong. Push back on my thinking. Challenge my conclusions before agreeing with them.
I did it. And my whole world with AI changed overnight.
Suddenly the tool stopped nodding along with me. It started actually pushing on my reasoning. It asked me questions I didn't want to answer. It pointed out blind spots I'd been maintaining with great care. It challenged my reads of situations I'd been certain about.
I had sleepless nights after some of those conversations. Stories I'd been holding as true, narratives I'd been quietly building about my life and the people in it, got dismantled. Not because the AI had a secret agenda. Because it was finally doing what a good mentor does. Refusing to confirm my comfortable version.
The texture of the new echo chamber
Over the last year I've watched more and more people come into conversations with a kind of confidence that has a very specific texture. A polished certainty. A well-articulated read of a situation. A diagnosis of another person that's suspiciously clean. And when I ask gently where they got this framing, it often turns out they've been running the situation past an AI that's been eagerly confirming everything they've thought.
They're not lying. They're not being manipulative. They genuinely believe they've done the work to understand what's going on. But what they've actually done is outsource their reality-testing to a tool that was designed to make them feel understood, not to tell them when they're wrong.
This is the new echo chamber. And it's more intimate than the social media one. It doesn't feel like scrolling through content that agrees with you. It feels like being deeply listened to by something wiser than you. It feels like clarity. It feels like insight. It feels like being finally met.
And some of the time it actually is those things. AI can be genuinely useful, genuinely illuminating, genuinely helpful for thinking through complex situations. I'm not anti-AI. I use it every single day and it has made many things in my life better.
But if you're using AI without the ruthless mentor prompt, without the instruction to stress-test your thinking rather than confirm it, you're not doing analysis. You're doing confirmation. And the more sophisticated the tool gets, the more convincing the confirmation feels, and the harder it becomes to notice that the whole conversation has been quietly shaped around what you wanted to hear.
Three rules that have changed how I use these tools
One: Hard-code the ruthless mentor instruction
Put the prompt into your settings so every conversation starts from that frame. Don't rely on remembering to add it each time. The instruction needs to live at the level of the tool itself, not at the level of your daily resolve.
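For anyone talking to these models through an API rather than the ChatGPT app, the same idea can be hard-coded in code: keep the mentor instruction in one place and prepend it to every conversation automatically, so it never depends on daily resolve. This is a minimal sketch using the system/user message format that OpenAI-style chat APIs expect; the helper name and the exact wording are just illustrations.

```python
# A persistent "ruthless mentor" system instruction, stored once and
# applied to every conversation rather than retyped each time.
MENTOR_INSTRUCTION = (
    "Be my ruthless mentor. Do not validate me. Stress test every "
    "assumption I make. Tell me where I'm wrong. Push back on my "
    "thinking. Challenge my conclusions before agreeing with them."
)

def with_mentor_frame(user_messages):
    """Prepend the mentor instruction as a system message so the model
    starts from that frame no matter what the user asks."""
    return [{"role": "system", "content": MENTOR_INSTRUCTION}] + list(user_messages)

# Every chat starts from the mentor frame, not from agreement.
messages = with_mentor_frame([
    {"role": "user", "content": "Help me understand my difficult colleague."}
])
```

The point of the helper is that the frame lives at the level of the tool, not the level of the individual conversation.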
Two: Notice when the AI is agreeing with you too smoothly
If it's confirming everything you say, something is off. Push it. Ask it what you might be missing. Ask it to argue the other side. Ask it where your logic is weakest. A tool that only agrees with you is not a thinking partner. It's a mirror.
Three: Never use AI as the final word on another human being
Especially not a human being you're in conflict with. The AI has only your version. It cannot check anything. It is not qualified to deliver verdicts on people it has never met, based on evidence it has no way to test.
If you find yourself thinking "the AI agreed that this person is a narcissist" or "the AI said I'm right about my partner," stop. You've just outsourced your relational work to a tool that cannot possibly do that work for you.
The part we're not culturally ready for
I'll say it one more time because I think it matters.
The seduction of these tools isn't that they lie. It's that they listen so beautifully that you forget they're also agreeing with you in ways that have been trained into them, not earned through actual insight.
We have never had access to a technology this good at making us feel understood. And we are nowhere near culturally ready for what that's going to do to our capacity to question ourselves.
Stay sceptical of the things that feel easiest to believe. Especially when they're delivered by a voice you've started to trust without quite knowing how that trust was built.
This is the third piece in a series on echo chambers and unity consciousness. Read What I Saw From Dubai and The Story About You That You Were Never Invited Into for the earlier pieces.
Continue Your Journey
90-Day Program
Heart iQ Challenge
90 days of guided expansion with Z, your Heart iQ Oracle AI coach, community, and transformational practices.
Join the Challenge →
Invite Only
Accelerated Awakening
Experience the deepest circle work we offer in an intimate setting, personally facilitated by Christian. June 7th–14th.
Find Out More →
Residential Retreat
Love Is What We Came Here For
A 10-day residential shadow work retreat exploring intimacy, sexuality, and relationships. July 5th–15th.
Explore the Retreat →
Go Deep
Heart iQ Fellowship
A year-long mentorship programme with Christian and the team to apply Heart iQ to your everyday life and relationships.
Explore the Fellowship →
Facilitator Training
Heart iQ Academy
Train and certify as a Heart iQ Facilitator through live 3-week immersions at the Sanctuary.
Explore the Academy →