§13.2 · AI Is the New OS. The Mission Is the Only App That Matters.

AI in the Boardroom

I run product with Claude as a teammate. It reads my epics and asks me what I missed. It builds wireframes from a written brief in minutes. It summarizes a competitor’s last six press releases before my coffee is cold. It drafts sections of a roadmap I would have spent an evening on. It is the closest thing to a senior editor and a junior designer and a research analyst in your pocket that has ever existed.

But it does not feel what the user feels.

That is the seam in this whole conversation. The discourse around AI in product management has been quick to declare the role obsolete. Engineers can prompt their way to an epic. AI can summarize a customer call. Anyone with the right tools can ship a working prototype before lunch. The argument goes: if all of that is true, what does the PM still do?

Let’s be honest about the answer. Robotic is already the word we use when a person responds without empathy. “That answer was robotic.” “Don’t be so robotic with the customer.” It is not a compliment when we apply it to a human. So why would it be the goal when we apply it to how products get built?

We do not want robots building experiences. We need empathy. We need connection. We need a human.

That motto — give a shit — is the line. A model can render empathy convincingly. It cannot live it. It can mimic the language of caring. It cannot actually care. The Purple Foxes flew into enemy fire because someone on the other end was bleeding. That is not a workflow. That is not a prompt. That is what humans do for other humans when nothing else will do. It is also what product work, at its best, is. And it is the part of the work AI is not crossing.


The whole system runs on one input: presence. Did you actually sit with the user, or did you let AI summarize the call? Did you watch them try the workflow, or did you ask the model what frustrated them? Did you carry their problem home with you, or did you tag the recording and move on?

AI cannot guess what you never witnessed. It can only respond to what you give it. Give it your assumptions and it gives back a confidently written version of your assumptions. Give it a real problem you have actually seen with your own eyes and it gives back leverage. The difference between a PM who is good with AI and one who is great is whether they have actually been in the room.

That is the trade. AI gives you a teammate. You give AI your time with the user.


Here is what I have come to believe. AI does not replace the PM. It exposes the people who were never doing the empathetic work in the first place.

Melissa Perri, who has trained more product leaders than almost anyone in the field, frames it cleanly: AI excels at the how. You own the why. That is the right line. AI is a stunningly good how engine. Specs, drafts, competitive scans, design variations, prototype builds, edge-case enumeration — all faster, all cheaper, all more thorough than what a human alone produced two years ago. But the why has not moved. The user. The mission. The right problem to solve in the first place. That is still a question only a human in the room with another human can answer.

This is also why the LinkedIn experiment is worth taking seriously. Tomer Cohen, LinkedIn’s Chief Product Officer, has been replacing traditional PM roles with what he calls Full Stack Builders — generalists who design, prompt, prototype, and ship end-to-end with AI rather than handing off across functions. The structural argument is sound. The role boundary is doing less work than it used to. I do not disagree with him on the mechanics.

But the question is not whether you call the role PM or full-stack builder. The question is whether the person doing the work can feel what the user feels. A full-stack builder who can do that is a product creator. A full-stack builder who cannot is shipping faster mediocrity in a smaller org chart. The title is downstream of the empathy.

Marty Cagan has made this point for years, and AI made it urgent. The title PM is not protected. The work of product creation is. People in product manager roles who never developed product sense are vulnerable. People in any role who can sit with a user, name the real problem, and own the outcome are the ones who matter. AI raised the floor on execution. It did not raise the floor on judgment.


The PM’s epic itself is changing because of this. A great epic in 2026 should be readable to AI. It should explain the problem clearly enough that an AI editor can ask sharp questions back. It should describe the experience cleanly enough that an AI prototyper can wire a working version. It should position competitively in language an AI strategist can stress-test. The best PMs I work with use that as their bar. If I uploaded this epic to Claude, would the AI understand the problem? Would it know what experience solves it? Would it know what competitors are doing nearby? When the answer is yes, you have written a real epic. When the answer is no, you have written a paragraph that hides what you do not yet understand.
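That three-question bar can be made concrete as a reusable review prompt. Here is a minimal, illustrative Python sketch; the question wording, the prompt framing, and the function names are my assumptions, not a standard, and the actual model call is deliberately left out:

```python
# Toy sketch of the "upload this epic to Claude" bar from this section.
# The three questions mirror the ones in the text; everything else
# (names, framing) is illustrative, not a prescribed format.

REVIEW_QUESTIONS = [
    "What user problem does this epic solve?",
    "What experience is proposed to solve it?",
    "What are competitors doing nearby?",
]

def build_review_prompt(epic_text: str) -> str:
    """Wrap an epic in the three questions a PM would ask an AI editor.

    The returned string is what you would paste (or send via an API)
    to the model. If the model cannot answer a question from the epic
    alone, the epic has not cleared the bar.
    """
    questions = "\n".join(f"{i}. {q}" for i, q in enumerate(REVIEW_QUESTIONS, 1))
    return (
        "Read the epic below and answer each question using only the epic. "
        "If the epic does not contain the answer, say so explicitly.\n\n"
        f"{questions}\n\n--- EPIC ---\n{epic_text}"
    )

if __name__ == "__main__":
    prompt = build_review_prompt(
        "Mobile checkout drop-off is high; propose a one-tap retry flow."
    )
    print(prompt)
```

The point of the sketch is the discipline, not the tooling: the questions are fixed in advance, so a vague epic fails visibly instead of hiding behind prose.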

But here is the catch. Pawel Brodzinski, who writes about this more clearly than most, names a structural truth the discourse keeps missing: AI, by design, pulls toward the median. The model produces what the data’s center of gravity says is most likely. Creativity lives at the fringes of the distribution. So an AI-generated epic is, almost by construction, a median epic. Confident. Clean. Average. The PM’s job is to be the deviation, the one who finds the actual user insight the median was never going to surface.

AI gives you a faster median. The human is the difference between median and meaningful.


What AI gives the PM in 2026 is access. The PM at a five-person startup can produce at the pace of a PM at Stripe. The PM with three product surfaces can keep up with all three. The new PM ships in their first month at a level that used to take a year to learn. The floor of the work just got higher.

The ceiling is still where it has always been: exactly as far as you are willing to sit with a user and care about their problem.