STRATEGY AFTER INTELLIGENCE
A Three-Part Series on What Happens When AI Develops Taste
We spent the last two years treating Artificial Intelligence like a very fast, slightly drunk intern. It could summarize a PDF or write a boilerplate email, but you wouldn’t trust it with the keys to the company.
That era officially ended this month.
In his viral essay Something Big Is Happening, Matt Shumer pointed out a terrifying, exhilarating truth about the latest models (like GPT-5.3 Codex): they are no longer just executing instructions. They are making intelligent decisions. They are exhibiting what feels, for the first time, like judgment. Like taste.
But here is the massive opportunity. AI can synthesize a billion data points to simulate taste, but it lacks lived experience. It has no humanity, no flesh-and-blood intuition, and no real cultural context. It understands the mechanics of human behavior, but it cannot organically align with human culture.
If intelligence itself becomes infrastructure, something everyone has access to, like electricity or bandwidth, then what happens to strategy?
And if you are in the business of strategy, this gap is your new empire.
PART I: When AI Develops Judgment, Strategy Changes
The Pivot: From Out-Calculating to Out-Feeling
Here’s where it gets weird.
Strategy used to mean finding the right answer. You did the research, ran the numbers, synthesized the insights, and arrived at the recommendation.
Today, logic is a commodity. An AI can run a thousand Monte Carlo simulations and spit out the statistically optimal business move in ten seconds. But when every single one of your competitors has access to this exact same “perfect” analysis, logic drops to a baseline requirement. It is no longer a competitive moat.
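The commodity point can be made concrete. Here is a minimal sketch, with entirely hypothetical option names and payoff numbers, of the kind of Monte Carlo scoring any competitor can now run in seconds. Notice that the top options come out nearly indistinguishable, which is exactly why the bottleneck moves from producing the analysis to selecting among its answers:

```python
import random

# Hypothetical strategic options: (expected return, volatility).
# Illustrative numbers only, not real data.
OPTIONS = {
    "expand_apac": (0.12, 0.30),
    "premium_reposition": (0.11, 0.25),
    "direct_to_consumer": (0.12, 0.35),
}

def simulate(mean, stdev, runs=10_000, seed=42):
    """Average outcome across many simulated futures."""
    rng = random.Random(seed)
    return sum(rng.gauss(mean, stdev) for _ in range(runs)) / runs

scores = {name: simulate(m, s) for name, (m, s) in OPTIONS.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} {score:+.4f}")
```

With enough runs, every option converges to roughly its expected value; the spread between the "best" and "worst" choice is a rounding error. The model can rank them, but the ranking carries almost no signal.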
The value of producing intelligence declines.
The value of endorsing a decision rises.
This is subtle but profound.
In an AI-saturated environment, strategy becomes something different: selecting the right answer from many optimal ones.
This isn’t semantics. Picture ten options on the table: all of them logically sound, all of them defensible, all of them “correct” according to the data.
Your job is no longer to produce the analysis. Your job is to look at those ten perfectly valid options and know (intuitively, culturally, temporally) which one actually has the soul to work in the real world.
AI increases the volume of viable options. That increases decision complexity. And decision complexity increases the risk of choosing wrong.
Boards and executives are going to face a strange new paradox: more intelligence, less clarity.
The advantage shifts from those who can produce analysis to those who can filter it. The ones with taste.
What “Taste” Actually Means
The ability to select among multiple optimal paths based on cultural alignment, timing asymmetry, brand coherence, and long-term positioning, rather than short-term optimization.
Taste optimizes for trajectory. It asks: where is this going? What's about to shift? What does this choice signal about who we are? Culture leans forward. Taste anticipates shifts before they become statistically visible. Strategy isn’t about predicting the median outcome. It’s about positioning for the asymmetric one, the outcome that changes everything if you’re right.
“But Won’t AI Eventually Close the Gap?”
Fair objection. If models keep improving at modeling human behavior, won’t they eventually internalize cultural nuance? Won’t they develop real taste?
To a degree, probably yes. The gap will narrow.
But here’s the thing: models are fundamentally retrospective. They’re trained on what already happened. They’re very good at predicting what’s likely based on patterns that already exist.
Culture doesn’t work that way. Culture is forward-leaning. The most valuable strategic insights aren’t about what’s probable: they’re about what’s possible. What’s emerging. What’s about to break through.
By the time something shows up in the training data, the window for strategic advantage has often already closed.
Taste is the ability to see around corners. To feel the vibe shift before the data confirms it. That’s not something you can optimize for. It’s something you cultivate through lived experience, cultural immersion, and a kind of intuition that doesn’t reduce to pattern matching.
Cultural Capital in Action
If you want to see what happens when perfectly calculated logic meets the messy reality of human culture, look at the sports, retail, luxury and lifestyle sectors. They are the canary in the coal mine for the new economy. AI cannot curate the high-context cultural nuances required to make an Asian flagship store resonate locally while maintaining global prestige. Data gives you the map; human taste dictates the aesthetic, the localized partnerships, and the cultural alignment. Without that cultural intelligence, a brand isn't expanding; it's just exporting inventory.
What This Means for Leaders
So AI doesn’t eliminate strategy. That’s the good news. What it eliminates is differentiation through analysis. The game has moved.
The premium now accrues to leaders who can:
Institutionalize selection discipline. Not just making good decisions, but building systems and cultures that consistently filter AI outputs well.
Define cultural coherence at the top. What do we stand for? What trade-offs do we refuse? These become strategic assets, not just brand fluff.
Govern AI outputs rather than just deploy them. The question isn’t “how do we use AI?” It’s “how do we decide what the AI’s suggestions actually mean for us?”
This shift isn’t technological. It’s epistemological. It’s about how we know what we know, and how we decide what to do about it.