Me and my AI buddy

People sometimes ask if I'm scared of being replaced by an AI program.

Typically, this comes up after they have shared their own uncertainty about their role as a designer or leader in the face of AI.

My response:

"What do you think, should I be? Should you?"

Most don't have a detailed response, but I then ask if they would be interested in an "AI version" of me (this is half serious).

I have asked 20 people this question.

Not one person was interested.

I even prompted them:

"Wouldn't it be good to have access to our conversations even when we don't see each other?"

"No, I like having an appointment in my calendar for these sessions. That means they will happen. I look forward to them."

I admit that this was a small, skewed sample, but it really got me wondering.

Coaching for me

In any creative act between two people, someone is bringing something to life (I consider coaching a creative act). There is a real, thinking, feeling person on the other end of that creation. When you engage with their work, a relationship forms between you and them. The real value of coaching is the connection between two conscious beings, one holding the other while focusing on their inner growth.

My Co-active Coaching training teaches the principle of the “Coaching Power”.

Coach and thinker have equal, though different, roles.

They collaborate for the client's benefit. Both derive energy not from each other as individuals but from the relationship between them.

Powerful coaching isn't about being a powerful coach. It's about the power the thinker experiences through this relationship.

Some issues I have with AI coaching

The data problem

What datasets is the AI trained on, and how?

When we think about 'AI Coaching', we need to know: is the LLM reading Freud or Facebook (or worse)?

Was it trained by skilled coaches, or by marketing professionals with a week of training?

The difference matters.

If we respect our clients, ourselves and our craft, we need to be clear. Do no harm.

The ethics problem

My second concern is the ethics and business practices of the companies building these solutions.

Christina Wodtke evaluates these companies in "I Love Generative AI and Hate the Companies Building It" (recommended).

Christina writes:

"I love this technology, but hate how these companies are making it."

Almost all of these companies engage in questionable business practices around the environment, labor, mental health, safety, and transparency. Users are left to decide which compromises they are willing to live with. Really?

This makes me sceptical of these companies offering tools to people who are desperate for connection and mentally fragile. This is too important to compromise on, least of all with an "oops, we didn't think of that" approach.

The Amish approach

Rather than asking "Will AI replace coaches?" we might learn from the Amish. They don't reject all technology. They ask: "How does this technology serve our people?"

My personal relationship with technology is very much inspired by this philosophy. I am open to new technology, but only if I can see how it effectively serves me and my goals.

For coaching, the question becomes: "How can AI help us serve the people we coach more effectively?"

This shifts everything. Instead of efficiency or novelty, we focus on the thinkers:

➔ Does this AI tool strengthen or weaken the coaching relationship?

➔ Does it help the thinkers develop self-awareness or create dependency?

➔ Does it align with our core values?

➔ What are the long-term effects on the people we serve?

This allows thoughtful adoption. We might find AI tools that genuinely serve thinkers: session preparation, tracking progress, and providing resources. But we evaluate through service to clients, not adoption for its own sake.

A bottom line

So, will I be replaced by an AI coach?

I don't know for sure.

What I do believe is that human coaching remains irreplaceable, not because AI can't learn techniques and ask questions, but because healing and learning happen in the connection between two human beings.

No algorithm can replicate that.