AI Can Write Code But Can't Comfort a Crying Child


There's a popular idea floating around right now that AI is going to take over the easy jobs and leave humans with the hard ones.


It's backwards.


AI is extraordinary at the hard things. Give it a complex coding problem, a research brief, a financial model, a 10,000-word document to summarize. It will outperform most humans, most of the time, in ways that would have seemed impossible five years ago.


But ask it to sit with a crying child until the crying stops? To notice that someone's laugh sounds slightly off today? To walk into a room full of nervous 10-year-olds on their first night away from home and make every single one of them feel like they belong? That's where AI goes completely quiet.


The easy things, as it turns out, are what make us irreplaceable.


What AI Actually Can't Do


Researchers at Stanford looked at how AI responds when users share thoughts about mental health crises. In roughly one in five cases, the AI was unable to provide a clinically appropriate response. Licensed therapists, by comparison, respond appropriately 93% of the time.


That gap is not a software problem. It's not something the next model update will close. It reflects something structural about what AI is and isn't.


AI can simulate empathy. It can recognize emotional language, generate warm responses, and produce text that sounds supportive. What it cannot do is actually feel what someone else is feeling. A 2025 review in Frontiers in Psychology put it plainly: AI can identify sadness, but it cannot feel sorrow. It can generate comfort, but it cannot care. What looks like empathy from an AI is, at its core, a very sophisticated pattern match.


People notice the difference, even when they can't name it. Research comparing AI-generated and human-written stories found that readers reliably felt less empathy toward the AI-written versions, even when the words were nearly identical. Something in the interaction registers as slightly off. The texture is missing.

That texture is exactly what camp is built from.


What Camp Trains Humans to Do


Think about what a good counselor actually does in the first 48 hours of a session.

They read a group of children who have never met. They figure out who is quietly panicking and who just looks that way. They pick up on the child who laughs loudest at dinner but won't make eye contact. They adjust their voice, their energy, their approach, dozens of times a day based on information they couldn't fully explain if you asked them.


None of that is in a manual. None of it runs on pattern recognition. It runs on something closer to intuition built from genuine care, genuine presence, and the accumulated experience of actually being around other people.


Camp professionals develop what workplaces, schools, and most institutions are still trying to figure out how to grow. The ability to read a room. The willingness to slow down for a person. The skill of being fully present when presence is what the moment needs.


These aren't soft skills. They are extraordinarily difficult human skills that AI cannot replicate, and they are becoming more valuable the more we automate everything else.


The Paradox Worth Sitting With


Here's what I keep coming back to: as AI gets better at the tasks we used to think required human intelligence, the things it consistently cannot do look more and more like what the best camp directors, counselors, and educators have always done.


Notice the individual in the crowd. Stay regulated when the group is falling apart. Create conditions where people feel genuinely seen. Repair a relationship after a mistake.


A 2025 review of AI and emotional wellbeing found that tools designed to reduce loneliness can sometimes intensify it, satisfying enough of the social need to prevent people from seeking deeper connection. It's a strange kind of trap. The better AI gets at approximating human warmth, the more important actual human warmth becomes.


For camp professionals, this isn't a warning. It's a clarification of something many of us already sensed. The thing we do all summer is not a holdover from a simpler time. It is a practice of the most durable form of human intelligence there is.


You cannot automate a campfire. You can build an AI that talks about campfires, describes the smell, generates a playlist of crackling sounds. But the thing that happens when people sit in a circle together around an actual fire, in the dark, telling the truth? That requires humans. Present ones.


What This Means for How You Lead


If AI is taking over the cognitive heavy lifting, the people who thrive going forward will be the ones who invest in developing the capabilities AI cannot touch.

Presence. Attunement. The ability to repair trust after it breaks. The patience to sit with someone in discomfort without rushing to fix it.


One thing worth considering: are you creating enough space in your organization for these skills to actually be practiced? Not just valued on a list of core competencies, but actually practiced, through the kinds of messy, unscheduled, human interactions that used to happen naturally and now require a little more intention to protect.


Camp figured this out first. The rest of the world is catching up.


About the Author

Matt Kaufman has spent 40 years in summer camp as a camper, counselor, and director, studying what makes people belong, grow, and thrive. He writes about intentional community, leadership, and the intersection of technology and human connection.


Connect with Matt:

  • Instagram: @mattlovescamp

  • LinkedIn: Matt Kaufman

  • Website: ilove.camp


Books by Matt Kaufman:

  • The Campfire Effect: How to Engineer Belonging in a Disconnected World

  • The Summer Camp MBA: 50 Leadership Lessons from Camp to Career
