Pavel Svitek

Product Management in the AI Era

The role didn't disappear — it got distilled

For the past few years, there's been a persistent anxiety in tech circles that AI would make product managers redundant. Turns out that prediction was both right and wrong.

The parts of PM that felt like work — writing tickets, running analytics queries, drafting specs, moving cards around a board — those are largely gone. But the parts that actually matter? They've never been more important.

What AI has done to product management isn't elimination. It's distillation.

Marcus Moretti, who builds Every's writing product Spiral as a one-person team, put it well in a recent guide on agent-native PM. He's responsible for product management, engineering, customer support, and marketing — all at once. Necessity pushed him to figure out how to do real product work without the traditional overhead. What he landed on is a useful framework for anyone operating in this environment.

The shift he describes feels accurate: software development has moved from 20% planning and 80% execution to roughly the inverse. The tools that used to consume a PM's day now run in the background. What used to be a three-hour analytics investigation is a quick conversation with Claude. A product review that used to be a fortnightly ritual now emerges from a single message.

That's the new baseline. The question is what you do with the time you've recovered.

The main loop hasn't changed. What fills it has.

Plan → Ship → Review → Repeat. This cycle hasn't changed. What has changed is where the leverage is.

Product management lives in the Plan and Review stages. Shipping is faster than ever — agents write the code, deploy the builds, and update the tickets. The constraint is now the quality of your thinking at the boundaries: what do you build, and did it work?

One useful reframe here is treating every shipped feature as an experiment. You never know exactly how users will respond. The more you ship, the more you learn — and those learnings compound. Once enough accumulate, you revisit the strategy and ask: does this still hold? If not, change it and get back to shipping.

The loop is simple. The discipline is in not breaking it.

Strategy is the most important artifact you'll produce

Before anything else — before features, tickets, or metrics — you need a strategy document.

This isn't a roadmap or a list of features. It's a one-page answer to: what problem are we solving, how specifically are we solving it, and for whom? Moretti structures his around Richard Rumelt's framework from Good Strategy Bad Strategy, and it's the right foundation.

A good strategy doc has five components:

  • Target problem — the recurring, expensive pain users experience today. Not a category. A specific situation.
  • Approach — one or two sentences describing your angle on solving it. This is not a goal ("make it faster"), not a feature, and not a generic positive statement. If you read it to someone experiencing the problem, they should be immediately intrigued.
  • Who it's for — a specific persona, not a demographic. Following the advice in Geoffrey Moore's Crossing the Chasm, focus on one persona early and nail it. Expand later.
  • Key metrics — three to five SMART metrics. Avoid page views and vanity metrics. Pick the ones that undeniably show people are getting value. At a minimum: people and dollars.
  • Tracks — two to four multi-month capability areas. Track one is almost always core performance or platform. More than four tracks is a signal you've lost focus.

Two optional sections worth adding: a "not working on" list (surprisingly useful for killing distraction before it starts) and a marketing/positioning section.

Critically, the strategy doc contains no product requirements. No specific features, no acceptance criteria, no statuses. It's the big picture. Everything else flows from it.
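The five components compress well into a checkable structure. Here's a minimal Python sketch of that idea; the field names and the `check` helper are mine, not Moretti's, and the limits simply encode the guardrails from the list above:

```python
from dataclasses import dataclass, field

# Illustrative only: a strategy doc reduced to a structure whose own
# constraints (3-5 metrics, 2-4 tracks) can be verified mechanically.
@dataclass
class StrategyDoc:
    target_problem: str          # a specific, recurring, expensive pain
    approach: str                # one or two sentences; not a goal, not a feature
    persona: str                 # one specific persona, not a demographic
    key_metrics: list[str]       # three to five SMART metrics
    tracks: list[str]            # two to four multi-month capability areas
    not_working_on: list[str] = field(default_factory=list)  # optional but useful

    def check(self) -> list[str]:
        """Return violations of the doc's own guardrails."""
        problems = []
        if not 3 <= len(self.key_metrics) <= 5:
            problems.append("key_metrics should have 3-5 entries")
        if not 2 <= len(self.tracks) <= 4:
            problems.append("outside 2-4 tracks is a signal you've lost focus")
        return problems
```

The point isn't to formalize strategy in code; it's that the doc is small and constrained enough that you *could*.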

Pulse: your daily window into what's actually happening

A strategy document tells you where you're going. A product pulse tells you if you're moving.

The pulse is a single-page report — around 30-40 lines — that covers four things:

  1. Headlines — a handful of bullets on what matters most. If someone reads only the first three lines, they should know what's important.
  2. Usage — primary engagement events, value-realization events, conversions, and the current value of each strategy metric with a delta versus the prior window.
  3. System performance — latency percentiles (p50, p95, p99) against the prior window, and the top five error signatures by count. This section is skipped if no tracing tool is configured.
  4. Followups — one to five specific, actionable things worth investigating next.
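The system-performance numbers are the most mechanical part of the report. As a sketch of what "latency percentiles against the prior window" means concretely, here is a nearest-rank percentile computed from raw samples (the function names are illustrative, not from any pulse tooling):

```python
import math

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile: the smallest value with at least
    p% of samples at or below it."""
    s = sorted(samples)
    k = max(0, math.ceil(p / 100 * len(s)) - 1)
    return s[k]

def latency_summary(samples: list[float]) -> dict[str, float]:
    # The p50/p95/p99 trio the pulse's system-performance section reports.
    return {f"p{p}": percentile(samples, p) for p in (50, 95, 99)}
```

Comparing today's `latency_summary` against yesterday's is exactly the "delta versus the prior window" the report shows.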

The pulse pulls from four categories of data: product analytics (PostHog, Mixpanel, Amplitude), application tracing (Datadog, Sentry, Honeycomb), payments (Stripe, Paddle), and a read-only database connection for anything the other tools can't reach.

What makes it useful is how the agent uses it. It doesn't just assemble the data — it reads the report from the perspective of a founder, annotates anomalies, and runs follow-up queries where something looks off. There are no hard-coded thresholds. The agent uses common sense and comparison against previous reports. If average response times are suddenly three times higher, it flags it and investigates. If everything is normal, the followups section is thin.

Every run saves to ~/pulse-reports/ as a dated Markdown file. A single pulse answers "what happened today?" The folder answers "when did this trend start?" Over time it becomes the product's working memory.
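The folder-as-memory idea needs almost no machinery. A hypothetical minimal writer, assuming one report per day and ISO-dated filenames so lexical sort equals chronological sort:

```python
from datetime import date
from pathlib import Path

def save_pulse(report_md: str, folder: Path) -> Path:
    """Save one pulse run as a dated Markdown file, e.g. 2025-06-01.md."""
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / f"{date.today().isoformat()}.md"
    path.write_text(report_md)
    return path

def report_history(folder: Path) -> list[Path]:
    """Chronological list of past pulses: the 'when did this trend start?' view."""
    return sorted(folder.glob("*.md"))
```

An agent answering "when did latency regress?" just reads `report_history` back to front until the numbers look normal again.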

Moretti runs his at 8am daily via Claude Code Routines. The day starts with a clear picture of how the product is actually being used.

Agents handle the busywork — you just talk about it

Tickets are the clearest example of how PM workflows have changed. The job of writing, updating, and organizing tasks now belongs to the agent, not the PM.

Your issue tracker needs an MCP integration — GitHub Issues and Linear both work. Once that's connected, you describe what you want to build, and the agent writes the tickets, assigns them, and keeps statuses updated. You don't read or write tickets anymore. You talk about what needs to happen, and the agent handles the rest.

The workflow simplifies accordingly. There's now/next/later instead of sprints. Two statuses: In Progress and Done. That's all you need.
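To see how little state that leaves on the PM side, here is the whole workflow as a data model. This is a hypothetical sketch, not how Linear or GitHub Issues represent things internally; the agent drives the real tracker through MCP:

```python
from dataclasses import dataclass
from enum import Enum

# Three horizons instead of sprints, two statuses instead of a board.
class Horizon(Enum):
    NOW = "now"
    NEXT = "next"
    LATER = "later"

class Status(Enum):
    IN_PROGRESS = "In Progress"
    DONE = "Done"

@dataclass
class Ticket:
    title: str
    horizon: Horizon
    status: Status = Status.IN_PROGRESS
```

Everything else (descriptions, assignments, status updates) is the agent's problem.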

What remains yours

For all of this automation, there are things that agents genuinely cannot replace.

Talking to users. There is no substitute. Moretti keeps his contact email conspicuous in the product so users reach out directly, and includes a 15-minute call booking link in every marketing email. The qualitative signal you get from real conversations is different in kind from what any dashboard can surface. You will never stop being surprised by what users say.

Product vision. Deciding what the product should eventually become — and why — is still a human judgment call. The strategy document captures it, but someone has to make the call.

Design judgment. Not visual design specifically, but the judgment about what to build and what not to. What trade-off is worth making. What complexity is acceptable. The agent can generate options; you still decide.

Knowing when the strategy is wrong. The pulse shows you the data. Reading it with the right interpretation, understanding when the numbers mean you should change course versus when they're noise, is still on you.

Product management has been reduced to the interesting parts. Dreaming up features, thinking through designs, looking at data, and talking to users. The economic argument for AI tools is obvious. But the better reason to embrace them is simpler: the work becomes more fun.

The drudgery is mostly gone. What's left is the part that was always worth doing.

Final thoughts

The transition to AI-native product management isn't a single change — it's a cascade of smaller ones that reinforce each other. The strategy document grounds everything. The pulse keeps you honest. Agents free you from the mechanics. And the recovered time goes back into the work that actually requires a human.

The goal isn't to automate product management. It's to do it better.