I recently watched a great talk by Matthew Prince, the CEO of Cloudflare (NYSE: NET), on the impact of AI on search traffic and content compensation. I had some thoughts.
TL;DR
Page views are dead. AI scrapes without sending anyone back.
Traffic looks alive but delivers no value.
Scarcity is the only way to regain leverage.
Cloud infrastructure holds the power to block unlicensed crawlers.
The next business model pays for knowledge gaps, not impressions.
The argument for content restriction is growing
The web runs on a broken deal.
AI models now scrape content, summarise it, and serve it to users with no need to click through. Publishers get visibility but no return. Page views mean nothing if there is no revenue and no leverage.
I am making a case for a full reset.
Yes, that means:
Block the crawlers.
Create scarcity.
Shift the model from exposure to value.
Let infrastructure providers enforce the rules. Build a new economy where creators get paid for filling the knowledge gaps that machines cannot close alone.
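At the site level, the reset can start with the Robots Exclusion Protocol. It only binds crawlers that choose to honour it, but the opt-out tokens below are the ones the major AI operators publicly document for their crawlers (GPTBot for OpenAI, ClaudeBot for Anthropic, CCBot for Common Crawl); a minimal robots.txt that withdraws consent looks like this:

```
# Opt out of AI crawlers that honour robots.txt
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```

This is polite refusal, not enforcement. Crawlers that ignore it have to be stopped deeper in the stack, which is exactly the argument below.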
The more we tie success to zombie metrics, the more we lose.
The traditional view doesn’t make sense anymore.
Page views used to pay. More traffic meant more clicks. More clicks meant more money. Now, Google answers on the search page.
AI gives a full summary before the user even scrolls. Headlines still appear, but no one clicks. The result is a business model that pretends to be alive.
Traffic goes up, revenue goes nowhere. A ghost town with good numbers. Ten years ago, two pages scraped got you one reader. Today, it’s eighteen pages to one. With AI tools, it’s even worse.
The facts:
OpenAI scrapes roughly 1,500 pages for every visitor it sends back.
Anthropic scrapes up to 60,000.
The web is training machines that never come back.
Scarcity should be the new strategy
Every industry that regained value did it by limiting access.
Music held back until Spotify paid. Sports bundled rights before broadcasters paid attention.
Web publishers still give everything away and hope traffic will return. It will not.
Scarcity changes that. Block unlicensed crawlers. Force a pause. Let creators choose when and how their work is used.
Without scarcity, any deal made today gets undercut tomorrow. With scarcity, the market resets around consent and price.
The power now sits in the pipes
You cannot stop bots at the CMS level.
They rotate identities, bypass detection, and resurface as something new. The power to stop them sits deeper.
Cloudflare already filters half the world's DDoS attacks in real time. The same filter can block unapproved crawlers. Prince hinted at a move that would shut out rogue bots from 20 percent of global traffic instantly.

Do that across a few major providers, and AI models lose access fast. The gatekeeper is not the website. It is the infrastructure layer. That is where the leverage lives now.
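As a toy sketch of what that gate does, here is the simplest possible version: a per-request check against known crawler user-agent tokens, with a licensing escape hatch. The blocklist names are real crawler tokens, but the function itself is illustrative, not Cloudflare's implementation; user agents are trivially spoofed, so real enforcement also leans on IP ranges and behavioural fingerprints.

```python
# Minimal sketch of infrastructure-layer crawler filtering.
# An edge proxy would run a check like this on every request,
# before the origin site ever sees the traffic.

# Documented AI crawler user-agent tokens (OpenAI, Anthropic, Common Crawl).
BLOCKED_CRAWLERS = ("GPTBot", "ClaudeBot", "CCBot")


def allow_request(user_agent: str, licensed: bool = False) -> bool:
    """Return False for unlicensed AI crawlers, True for everything else.

    `licensed` stands in for a real licensing check: a crawler that
    has paid for access passes through even though it is on the list.
    """
    if licensed:
        return True
    ua = user_agent.lower()
    return not any(bot.lower() in ua for bot in BLOCKED_CRAWLERS)
```

The `licensed` flag is the interesting part: the same switch that blocks a bot is the switch that prices it. Flip it per contract, and scarcity becomes a market.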
Don’t sell clicks, sell holes
Scarcity only works if you know what to charge for.
Page views are too blunt.
LLMs are not blank slates. They are Swiss cheese. Yes, Swiss cheese.
They hold most of what users want, with very specific holes. Gaps in regulation, in edge cases, in rare domains. Those gaps are signals. That is where the next model pays.
Instead of publishing to attract impressions, creators publish to fill documented holes. Spotify already does this with music. It posts what is missing. Composers fill the gap. The same system works for knowledge. Depth gets paid. Noise does not.
When rage does not pay, publishers stop feeding it.
If content earns money by being right, not being viral, the whole system slows down. Attention resets around clarity. Readers get better answers. Models cite real sources. And those sources earn something for their work.
When models summarise facts, they trace the chain back to a licensed origin. That loop only works if content is not free for anyone to take.
The moment scarcity returns, trust begins to scale.
No scarcity, no survival
This is not a new phase. It is the end of the old one.
Free content has been strip-mined for value. The platforms have trained their models. They do not need clicks anymore. They do not need publishers.
Unless the web changes its rules, this becomes permanent. Cloud infrastructure can change those rules overnight. Block the bots. Create price signals.
Let creators get paid to teach what machines cannot guess. That is the reset.
Without it, the obituary is already written.
For those who want to watch the complete talk, you can do so here: