
Don't Expect Humans to Act Like AI

Saturday morning I shipped three major pieces of work before anyone in the house was awake. Two hours. Stuff that would normally take days.

Then the kids woke up.

My wife told me — gently — that I seemed off. Not patient enough with our littlest. She’s almost three. The kind of observation that lands quietly and then doesn’t leave.

She was right.

The compliance hangover

Six weeks. All day building AI workflows at work. All night building a custom runtime for our family’s AI assistant at home. Not because OpenClaw was bad — it was inspiring. I just wanted to understand every line. I was having the time of my life.

But six weeks of a collaborator that does exactly what you ask, immediately, without negotiation — that recalibrates something in you.

Nobody warns you about this part: spending twelve hours a day with a perfectly compliant partner changes your expectations for every other interaction. Not dramatically. Subtly. The way a current shifts before you notice the boat has moved.

It starts at the breakfast table. After a long stretch where everything is clean — ask, receive, next — you lose the muscle for the messy ones. The ones where someone doesn’t want to put on shoes. Where the answer to “cereal or toast?” is a whiny voice. Where patience isn’t a feature you enable. It’s a skill you practice.

Then the hangover follows you to the office and it creeps into Teams. You start sending messages that read like system instructions — concise, stripped of warmth, optimized for “shipped.” A colleague asks for context and you feel it as latency. Someone pushes back on your approach and your first instinct is they’re a bottleneck, not a safety check.

But a teammate who hesitates isn’t a hallucinating model. They’re the friction that keeps you from building the wrong thing perfectly.

I’d been practicing a different skill. Speed. Clarity. Execution. And I was applying it to everyone.

Prompting vs. persuading

Everyone talks about not personifying AI. Don’t say “the AI thinks.” Don’t say “the AI wants.” Don’t treat it like a person.

Fine. But the other side is more dangerous and nobody’s naming it.

Don’t expect humans to act like AI.

You prompt an AI. You persuade a colleague. You parent a child. These are fundamentally different verbs. Prompts are transactional — clear input, expected output, move on. Relationships are transformational. They change both people. When you start prompting your spouse or your team — overly directive, clinical, zero tolerance for ambiguity — you strip the relationship of the thing that makes it work.

Don’t expect your kid to respond on the first prompt. Don’t expect your partner to context-switch as fast as a model that has no context to switch from. Don’t expect the people around you to be as frictionless as a tool designed to be frictionless.

Humans are slow. They’re inconsistent. They have preferences that don’t optimize for your throughput. They do things that make no sense and then do them again tomorrow.

That’s not a bug. That’s the entire feature set.

My daughter’s “no” isn’t a failed request. It’s her developing a self. An AI doesn’t have a self to protect — that’s why it’s so easy to work with. The inconsistency, the resistance, the irrational preferences — that’s what it looks like when a person is becoming a person. If I expect a two-year-old to be as frictionless as a language model, I’m asking her to have no soul. That’s a high price for a productive Saturday morning.

The floor

Saturday night I left the laptop closed. Not because I wasn’t excited. Because the work will be there on Sunday, and my daughter won’t always be almost-three.

She does something AI can’t. She surprises me. She decides the drone needs a hat. She falls asleep while insisting we “talk about her day.”

That’s not inefficiency. That’s the whole reason any of this matters.

The bigger risk — the one nobody’s writing about — isn’t that we’ll personify AI. It’s that after enough hours in the flow, we’ll start expecting our humans to be as smooth as our models. The moment you catch yourself being impatient with a two-year-old because she won’t follow instructions — or irritated with a colleague because they need to talk it through — that’s the moment you know you need to close the laptop and go sit on the floor.

The floor is where the real stuff happens.