
Human traffic to publisher websites is now in decline, according to data from AI licensing start-up Tollbit.
Tollbit’s latest State of the Bots report provides a snapshot of AI web crawler activity across its approximately 400 partner websites.
In the period from April to June this year, these sites – a broad sample of publishers covering consumer lifestyle, news and special interest titles – saw a 9.4% decline in human-originated page requests versus the previous three months.
Meanwhile, requests from AI bots increased sharply, representing one in 50 visits, up from one in 200 at the start of 2025.
Human vs AI bot requests, per website (log scale)

This data provides further evidence for what many digital content businesses will already suspect: that AI substitution is taking place, and causing economic harm, now and at a meaningful scale.
The data bears out the obvious: as these systems scrape the internet in real time and provide summaries – either in search or through chatbots – users do not need to visit publisher websites as frequently as they did when they relied on ten blue links to navigate the web.
Irrespective of where you sit on the bull to bear spectrum when it comes to AI (and I think there are compelling cases to be made both that advancements are slowing and the fundamental capabilities of LLMs are more modest than has been touted), this technology is permanently transforming everyday information retrieval.
For obvious reasons, Google is at the centre of this shift. Tollbit’s data also provides a new window on how the search giant’s relationship with website owners is changing.
Multiple studies have already demonstrated that the referrals Google is sending to publishers from search are in decline as it rolls out new AI search features.
Since AI Overviews rolled out to more than 100 countries at the end of October last year, we now know Google has also been scraping sites ever more aggressively.
Googlebot requests trend upwards after widespread rollout of AI Overviews (0 = 28 October 2024)

Its systems appear to need to access content in real time to summarise and provide answers, as well as asynchronously to build the index of the web that powers its core search product. These dynamics are likely to be important drivers of the transfer of publisher engagement from human to bot. They are also a signal of the changing value exchange between content businesses and Google – publishers are giving more away while receiving less in return.
As Press Gazette has covered extensively, publishers are being forced into this arrangement because Google is tying access to content for AI to visibility in Search (and increasingly Discover).
This conduct raises obvious competition issues and is facing legal action and regulatory scrutiny in the UK and elsewhere. Given the pace of regulatory action, however, much harm may be done before interventions are made.
Tollbit’s findings also point to a longer-run but more fundamental change in how we all interact with the digital world. They suggest that the future will be one in which an ever-increasing amount of our time is spent within walled gardens. Already around a third of our online time is spent on social media platforms which algorithmically serve content to us based on our habits and past behaviours.
Add to that a – likely rapid – expansion in the use of AI systems which keep us locked in a singular interface operated via text or voice prompts. In the near term these are AI search or chat applications that provide a natural language answer, negating the need to visit a website.
Looking further – but not far – into the future are fully-fledged AI assistants that go substantially further. These will both fetch information and complete actions – possibly unprompted – on our behalf, based on an ever-expanding dataset about our lives and preferences.
In a court filing from last week Google acknowledged this directly, bluntly stating what it has been publicly denying for a long time: “the open web is already in rapid decline” (with the punchline being, of course, that it should not be forced to divest part of its tech stack that gives it the dominant position on both sides of the digital advertising market).
Extract from Google court filing:

In this future world of walled gardens, where AI applications handle many of our interactions in the digital realm, publishers will need to adapt their strategy to succeed. In the medium term, this means understanding how they can continue to draw real human audiences to their owned-and-operated properties (ideally without relying on an intermediary to get them there) as AI adoption continues.
This requires some analysis by publishers. They first need to identify the gaps in user needs that will be left unserved by the current and coming wave of AI applications. These gaps then need to be referenced against the space an individual publisher can valuably occupy for its audience.
Whilst the outcomes of this work will differ from title to title, universally publishers would do well to sharpen their focus on growing brands, innovating in product environments, producing deeply-engaging, rich format journalism and commissioning inherently-human content that cannot be substituted by a summary (from renowned experts, featuring humour, opinion etc.).
Licensing will also play a growing role. The media cannot – and should not seek to – hold back the advance of this technology. In future many user needs that are currently satisfied by publisher websites will be served by walled garden AI services.
But licensing introduces intermediaries into media value chains, which carries risks. A model for understanding the relationship between core owned-and-operated revenue and licensing revenue is therefore essential. Without one, deals risk ceding too much value by threatening future core revenue.
Disclosure: David Buttle helps Tollbit develop its State of the Bots reports.
Email [email protected] to point out mistakes, provide story tips or send in a letter for publication on our "Letters Page" blog