This week our team attended an eye-opening workshop at IAB’s 30th Annual Leadership Meeting focused on how large language models and AI search are upending the economics of the open web for publishers. Insights shared by Jonathan Roberts, chief innovation officer at People Inc., and by IAB Tech Lab’s Shailley Singh and Hillary Slattery built on themes that have already begun to dominate industry discourse in 2026.

Here’s what premium publishers need to understand and act on now.
The Traffic Collapse Is Real — Not Hypothetical
In his January column, “The AI Search Reckoning Is Dismantling Open Web Traffic – And Publishers May Never Recover,” AdExchanger’s Anthony Vargas notes that generative AI hasn’t just altered search; it has fundamentally changed how monetization works on the open web.
Publishers have reported traffic declines of 20%, 30%, and in some cases as much as 90%, driven by zero-click AI search summaries and answer engines that keep users on the platform and off publisher sites.
This isn’t theoretical. Across verticals, from news to niche blogs to ecommerce, the sustained decline in referral traffic is reshaping the economics that publishers have relied on for decades.
The Fundamental Shift: From Traffic to Contribution
At the IAB workshop, participants repeatedly came back to the idea that traffic is no longer the primary currency.

In the old world, search engines aggregated links and sent visits downstream. In the AI era, branded summarization and agent-driven discovery extract the value before a click happens. This means:
- Users increasingly get answers without visiting publisher sites
- “AI Overviews” drastically reduce click-through rates
- Traditional referral-traffic-based advertising models are eroding
Vargas’ piece made this tangible with real performance data showing how AI search is eating into organic referrals, even for high-quality content.
This reinforces what we heard at the workshop: publishers must shift their thinking from “protecting traffic” to “monetizing contribution.”
Blocking Isn’t a Solution — It’s a Tactical Response
Many publishers responded to early AI bots by tightening robots.txt and blocking crawlers. Roberts made it clear why this alone won’t protect value:
- Blocking invites anonymity and spoofed agents
- It often blocks more bot traffic than real user traffic
- It doesn’t establish permissions, provenance, or compensation
This reflects a real market truth: simply hiding content doesn’t create economic leverage. Instead, publishers need frameworks that declare who is accessing content and under what terms.
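The blocking tactic described above typically looks like the robots.txt fragment below. The crawler tokens shown (GPTBot, ClaudeBot, Google-Extended, CCBot) are real and publicly documented, but compliance is voluntary: a directive like this expresses no permissions, provenance, or compensation terms, and a spoofed agent simply ignores it.

```
# robots.txt — blocks well-known AI crawlers, but only the cooperative ones
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```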
CoMP: A Foundation for Rights, Not a Price Regulator
The Content Monetization Protocol (CoMP) introduced by the IAB was presented as a standards-first framework for managing AI agent access. CoMP is designed to:
- Allow machines to declare identity and intent
- Support permissioning and licensing at scale
- Track usage through tokenized authentication
- Separate discovery from downstream usage and monetization
This matters because the current ecosystem has no standardized way of signaling rights to AI platforms. Publishers either give content away for free or block it — neither of which yields compensation in an AI-driven discovery world.
There Is Real, Payable Demand — If You Can Capture It
One of the most encouraging themes of the workshop was that the demand for trusted content is not imaginary:
- LLM operators already work with rate cards (often cited in the industry as $10–$30 CPM at scale)
- Enterprise buyers have budgets and workflows tied to high-quality insight
- A growing number of agents beyond major chatbots are surfacing value (e.g., specialized assistants, tool-chain agents, vertical search)
The issue isn’t a lack of value. It’s that publishers have not yet established the standards and signals needed to capture that value in a machine-mediated world.
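The rate-card range cited above translates into real money at scale. A back-of-envelope calculation, using a hypothetical volume of licensed agent accesses (the article cites only the $10–$30 CPM range, not volumes):

```python
# Back-of-envelope revenue from the rate-card range cited in the article
# ($10-$30 CPM, i.e. per 1,000 licensed accesses). Volume is illustrative.

def cpm_revenue(accesses: int, cpm_usd: float) -> float:
    """Revenue for a given number of paid accesses at a CPM rate."""
    return accesses / 1000 * cpm_usd

monthly_accesses = 5_000_000  # hypothetical licensed AI-agent accesses/month
low = cpm_revenue(monthly_accesses, 10.0)
high = cpm_revenue(monthly_accesses, 30.0)
print(f"${low:,.0f} - ${high:,.0f} per month")  # $50,000 - $150,000 per month
```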
The Road Ahead: Discovery, Rights, and Premium Content
Here’s how we think publishers should be preparing:
1. Treat bots as a class of users – Measure their value, track their interactions, and establish identity, not just block them.
2. Signal rights and intent clearly – Publishers need machine-readable metadata (rights, permissions, usage conditions) so AI systems understand what they can and cannot do.
3. Separate discovery from usage monetization – Discovery can be public, but usage (summarization, training, reuse) should require consent and potentially compensation.
4. Build or join content marketplaces – A marketplace layer could bring relevance, quality, and rights data to the surface in ways traditional search never did.
5. Diversify beyond referral traffic – Subscriptions, direct licensing, APIs, and usage-based models are becoming more important as click-based ad revenue declines.
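Steps 2 and 3 above come down to publishing a rights signal that machines can parse. A minimal sketch, assuming a simple JSON manifest served alongside content; the schema here is hypothetical, and real standards efforts (such as IAB Tech Lab’s CoMP work) define their own formats:

```python
# A hypothetical machine-readable rights manifest, separating open discovery
# from licensed or denied downstream usage. Schema and values are illustrative.

import json

rights_manifest = {
    "publisher": "example.com",
    "content_path": "/articles/*",
    "permissions": {
        "discovery": "allow",        # indexing and linking stay open
        "summarization": "license",  # requires a negotiated license
        "training": "deny",          # not available at any price
    },
    "license_contact": "licensing@example.com",
}

print(json.dumps(rights_manifest, indent=2))
```

A declaration like this is what turns “hiding content” into economic leverage: an agent can discover the content, but it also discovers the terms.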
What This Means for Premium Publishers
The open web has entered an AI economics era, not just an AI technology era. The impact of AI search is not modest or transient; it is dismantling old traffic models.
Workshop participants and Roberts underscored that without new standards, publishers will continue to lose influence and revenue.
“The way publishers have traditionally measured success—by traffic—is changing fast. AI search is rewriting the rules, and zero-click answers mean fewer clicks, not less value. The real opportunity is in recognizing the contribution publishers make to this new discovery landscape and creating clear, actionable ways to monetize it. At Nomix Group, we’re focused on building systems that don’t chase illusions of old traffic but instead capture real value where commerce actually happens.”
— Todd Ulise, Chief Revenue Officer, Nomix Group
But the good news is that standards like CoMP, combined with strategic rights management and monetization frameworks, offer a pathway forward. Publishers who engage early with these protocols, build machine-readable rights signals, and lean into new discovery markets will have an advantage in the next chapter of content economics.
