
On September 10th, 2025, Google made a major, but not-immediately-apparent, change to how the SERP is populated. Users likely didn’t notice any difference, but marketers and fellow SEOs are up in arms. In my own world as an SEO consultant, this update has changed how I perceive and share the narrative of my clients’ performance.
Read on to learn more about the recent change to SERPs and what it means for you.
What Happened?
The update in question has to do with how Google’s SERP is loaded and presented to users. We all know that Google shows 10 organic listings on each page of the SERP. Aside from a brief testing period with continuous scroll, which I wrote about previously, the rule of thumb has always been 10 results per page.
However, up until September 10th, one could force Google to return up to 100 results on a single page by appending a small parameter to the SERP’s URL. Adding “num=100” would override the default of 10 and produce a single SERP containing 100 results.
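For illustration, here’s roughly what that looked like in practice. This is a simplified sketch: `q` and `num` are the real Google search parameters, but the helper function and example query are hypothetical.

```python
from urllib.parse import urlencode

def serp_url(query: str, num: int = 10) -> str:
    """Build a Google SERP URL. Before September 10th, 2025,
    passing num=100 returned up to 100 results on one page."""
    params = {"q": query, "num": num}
    return "https://www.google.com/search?" + urlencode(params)

# One request used to cover the entire top 100:
print(serp_url("best running shoes", num=100))
# → https://www.google.com/search?q=best+running+shoes&num=100
```

Today, that same URL simply returns the standard 10-result page; the parameter is ignored.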
You may be thinking, “who would ever do that?”
Exactly! No one! At least, no person. But bots were using this parameter to scrape the SERP’s top 100 results in a single pass rather than conducting 10 separate crawls. This includes the bots powering the rank tracking tools we all use to monitor and analyze our brands’ organic visibility (Moz, Ahrefs, SEMrush, STAT: everyone was impacted).
It’s not that they can’t crawl SERPs anymore; it’s that they can only see 10 results at a time. And because each SERP request costs a few cents to crawl, there’s no cost-effective way to collect the top 100 results anymore: what used to take one request now takes 10.
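The cost math is simple but brutal at scale. The figures below are hypothetical (a 3-cent request cost and a 10,000-keyword tracker are made-up numbers), but the tenfold multiplier is the point:

```python
# Hypothetical figures for illustration: assume each SERP request
# costs 3 cents and a tracker follows 10,000 keywords daily.
COST_PER_REQUEST_CENTS = 3
RESULTS_PER_PAGE = 10   # the fixed page size you get now
TARGET_DEPTH = 100      # results a tracker wants per keyword
KEYWORDS = 10_000

# With num=100, one request covered the full depth; now it takes 10 pages.
requests_before = 1
requests_after = TARGET_DEPTH // RESULTS_PER_PAGE

cost_before = KEYWORDS * requests_before * COST_PER_REQUEST_CENTS / 100
cost_after = KEYWORDS * requests_after * COST_PER_REQUEST_CENTS / 100
print(f"Daily crawl cost: ${cost_before:,.0f} before vs ${cost_after:,.0f} after")
# Daily crawl cost: $300 before vs $3,000 after
```

That 10x jump is why most tools have settled on a top-20 depth (two requests per keyword) rather than eating the full cost of 10.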
What Else Was Impacted?
While the rank trackers work out a new plan (most are just offering the top 20 results instead of the top 50 or top 100 they reported previously), there’s one data tool benefiting from this change: Google Search Console.
Around mid-September, you may have seen your GSC impressions drop significantly, even though your clicks have remained mostly the same.
This drop in impressions stems from the fact that, unlike Google Analytics, GSC factors bot traffic into the data you see. And bots are no longer crawling SERP results in positions outside the top 20 or so. Before, each time a bot forced a 100-result SERP, it awarded an impression to every domain ranking in the top 100 positions.
But since these bots are now only crawling the top 20 results or so, your site has likely lost a bunch of impressions from bots for its pages ranking in these lower positions.
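The mechanism above can be sketched in a few lines. The crawl counts and ranking position here are made-up numbers purely for illustration; only the depth cutoffs (100 then, roughly 20 now) come from the discussion above:

```python
# Illustrative only: position and crawl volume are hypothetical.
def bot_impressions(position: int, daily_crawls: int, crawl_depth: int) -> int:
    """GSC logs an impression whenever a ranking appears on a crawled SERP."""
    return daily_crawls if position <= crawl_depth else 0

position = 80       # a page ranking deep in the results
daily_crawls = 500  # hypothetical bot crawls per day for its keyword

before = bot_impressions(position, daily_crawls, crawl_depth=100)  # num=100 era
after = bot_impressions(position, daily_crawls, crawl_depth=20)    # top-20 era
print(before, after)  # 500 0
```

A page at position 80 goes from hundreds of bot impressions a day to zero, while a page in the top 20 keeps its bot impressions, which is exactly the pattern of a steep impression drop with flat clicks.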
The good news is that these were never real impressions anyway. People rarely look beyond the top 20 or so positions, so the impressions you see now are more likely to come from humans, and more actionable as a result.
Sure, there are still bots giving impressions in the top 20. But this is more aligned with the actual user experience. I hate to be the one to tell you, but those impressions you earned in 2024 from that page with an average position of 80 were probably bots.
Why Did Google Do This?
This is the most interesting part! As we said earlier, this change really doesn’t impact the user at all. They initiate a search, they see 10 results, just like they always have. So why would Google, a company that prioritizes user experience, make this change?
The answer is the same as the culprit behind so much volatility in the industry: AI.
ChatGPT, and other AI engines, are some of the most prolific Google searchers out there. If you ask ChatGPT a question that it determines requires more, or more recent, information than it currently has, it performs a search.
You see this in the ChatGPT interface as “Thinking…”, but in the background, ChatGPT has generated a small set of search queries it thinks might help collect the desired information. It then plugs these queries into some unseen Google (or similar) search bar and crawls the results. It summarizes those results and, boom, you have your answer.
AI engines like ChatGPT are information synthesizers. They love large amounts of data and large numbers of webpages. You could infer that ChatGPT might love to synthesize, say, 100 webpages.
See where I’m going?
ChatGPT, like the rank tracking tools mentioned above, was using the num=100 parameter to force a 100-result SERP it could synthesize to provide its users with a concise, trusted answer.
Google has the best webpage credibility ranking algorithm out there, so why reinvent the wheel? Instead of building its own content scoring and ranking algorithm, ChatGPT piggybacks off of classic search engines like Google.
The theory is that Google caught on to the fact that their competitor, one who is aggressively angling to take over the search industry, was taking advantage of their product and calling the results their own.
And so, with minimal impact to its users, but a whole lot of hubbub in the SEO community, it disabled the num=100 parameter.
How Does This Impact Me?
The most important question. All things considered, it doesn’t really change anything. Your site still performs just like it did before this rolled out. Users are still searching, and (hopefully) still seeing your site in the SERP.
But bots are now limited in what they can crawl. As mentioned before, rank tracking tools have limited their output to the top 20 results (or so) in order to avoid inflating their SERP crawl budgets.
At Synapse, we’ve always monitored the top 50 results for each keyword. Anything outside of the top 50, we labeled “Not in Top 50”, as we figured no one was really going deeper into the search results than that. What really is the value in knowing you’re ranked in position #99?
Today, we’ve adjusted that to be the top 20 results. And really, the logic still makes sense. If 99% of clicks happen in the top 20 results, then why consider anything ranking outside that threshold?
The biggest change you’re likely to notice is the impact on impressions, click-through rate, and average position in Search Console. Clicks will remain the same (since most bots aren’t clicking on organic listings), but these secondary metrics have shifted.
There’s a lot of change in the search marketing industry these days. We’ve developed a series of eBooks that dive deep into how AI is impacting search and how to use this change to your advantage. Download our AI eBooks for free today!
And if you’re interested in learning more about how this change is impacting your site’s performance profile, get in touch with us at sales@synapsesem.com.
