Grokopedia, CTR drops and Google-CWS – SEO News – #1 – November 2025
When Elon Musk steps into a new game, the internet grabs the popcorn. This time, it’s an encyclopedia. In short: Grokopedia is Musk’s answer to Wikipedia. The difference? Instead of an army of volunteers, it’s powered by algorithms and the Grok model. The platform launched with version 0.1 and a package of automatically generated content.
Right out of the gate, it published nearly 900,000 pages. That's a pace at which even the busiest copywriter might start reconsidering their career. And although the project has only been live for a short while, search engines have already noticed it.
Grokopedia is a direct challenge to Wikipedia, but more importantly – a large-scale live test of how Google and Bing handle AI-generated content. According to available analyses, Google and Bing began indexing Grokopedia pages within the first week of launch. In Google, around 1,400 pages are indexed, while Bing has already indexed about 35,000! Is that a lot? For a project fresh out of the oven – absolutely.
Remember: being indexed doesn’t mean generating traffic. The first reports from Ahrefs are brutal for Grokopedia. Despite hundreds of thousands of published articles, the tool identified only:
Most websites today host their media assets – images, videos, or scripts – with external providers such as Amazon S3, Google Cloud Storage, or Microsoft Azure. It’s a standard practice. However, John Mueller from Google reminded administrators of a common mistake: giving up analytics and control over these resources.
If your images are displayed under the default cloud provider URL, from Google Search Console’s perspective, they are “invisible.” As a result, you may miss indexing errors, security warnings, or visibility statistics related to your graphics and multimedia.
All it takes is attaching your own subdomain, like media.yourdomain.com, pointing it with a CNAME record to the cloud resource, and verifying it in Search Console. In practice, it provides better insight into how Google sees your content and fewer surprises like “why did image traffic drop?”
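Before verifying the subdomain in Search Console, it's worth confirming that the CNAME actually resolves where you think it does. Below is a minimal sketch using the dnspython library; the subdomain and the S3 bucket endpoint are placeholders for illustration, not values from the article.

```python
# Minimal sketch: verify that your media subdomain resolves via CNAME
# to the cloud provider before adding it as a property in Search Console.
# Requires the dnspython package (pip install dnspython).
# SUBDOMAIN and EXPECTED_TARGET below are hypothetical placeholders.
import dns.resolver

SUBDOMAIN = "media.yourdomain.com"
EXPECTED_TARGET = "your-bucket.s3.amazonaws.com"

try:
    answers = dns.resolver.resolve(SUBDOMAIN, "CNAME")
    # CNAME targets come back with a trailing dot, so strip it.
    targets = [answer.target.to_text().rstrip(".") for answer in answers]
    if EXPECTED_TARGET in targets:
        print(f"OK: {SUBDOMAIN} -> {targets}")
    else:
        print(f"CNAME points elsewhere: {targets}")
except dns.resolver.NoAnswer:
    print(f"No CNAME record found for {SUBDOMAIN}")
except dns.resolver.NXDOMAIN:
    print(f"{SUBDOMAIN} does not exist in DNS")
```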
For months, the SEO community has focused on AI Overviews (AIO), treating them as the main “click thief” on search results pages. The logic seemed simple: if Google provides the answer itself, users don’t need to click. The latest study from Seer Interactive, however, shows that the issue is much deeper and more systemic.
The most alarming signal isn’t from queries containing AIO. The real shock comes from traditional results: the organic click-through rate (CTR) is dropping sharply, even when there’s no AI block on the SERP.
The Seer Interactive study (which analyzed 25.1 million organic impressions for over 3,100 informational queries between June 2024 and September 2025) captures the trend in the chart below:
Figure: Paid & Organic CTR Trends – 12 Months (Oct '24 – Sep '25)
Source: https://www.seerinteractive.com/insights/aio-impact-on-google-ctr-september-2025-update
It turns out that even if you manage to secure a high ranking and Google doesn’t display an AI answer, users are still much less likely to click your link than they were a year ago.
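You can check whether the same erosion shows up in your own data by exporting the Search Console performance report (Queries tab) for two comparable periods. Here is a minimal sketch with pandas; the file names are hypothetical, and the column labels follow the standard GSC CSV export, so adjust them if yours differ.

```python
# Minimal sketch: compare year-over-year organic CTR from two
# Search Console performance exports (Queries tab, CSV format).
import pandas as pd

def overall_ctr(path: str) -> float:
    """Aggregate CTR = total clicks / total impressions.

    Summing before dividing weights each query by its impression
    volume, instead of averaging per-query CTRs equally.
    """
    df = pd.read_csv(path)
    return df["Clicks"].sum() / df["Impressions"].sum()

ctr_2024 = overall_ctr("queries_sep_2024.csv")  # hypothetical export
ctr_2025 = overall_ctr("queries_sep_2025.csv")  # hypothetical export

change = (ctr_2025 - ctr_2024) / ctr_2024 * 100
print(f"CTR Sep 2024: {ctr_2024:.2%}")
print(f"CTR Sep 2025: {ctr_2025:.2%}")
print(f"Year-over-year change: {change:+.1f}%")
```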
If you closely analyze your server logs, you may have recently noticed a new visitor. Google-CWS is a “user-triggered fetcher,” a fetching module activated by user actions. Its sole purpose is to verify resources submitted by developers in the Chrome Web Store admin panel.
When a developer submits a new extension or theme to the Chrome Web Store, they must fill out several metadata fields, such as the extension's homepage URL and its privacy policy URL. Google-CWS is the tool that automatically "visits" and verifies these URLs.
Google’s goal is to verify whether these links are active and not leading to phishing pages or malware. It’s an administrative and security measure to protect the CWS ecosystem.
For SEO specialists and administrators, the main technical implication is that Google-CWS, like other “user-triggered” modules (e.g., Google Site Verifier), usually ignores directives in the robots.txt file.
If a developer intentionally provided a Privacy Policy URL, Google assumes the intention is to allow it to verify that address. Blocking it in robots.txt would contradict the developer’s own submission in the Chrome Web Store.
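If you want to see whether Google-CWS has visited your server, a quick pass over the access log is enough. The sketch below assumes a standard combined log format and that the fetcher identifies itself with a "Google-CWS" token in the user-agent field; check Google's crawler documentation for the exact string, as the article doesn't quote it.

```python
# Minimal sketch: count Google-CWS requests in an Apache/Nginx
# combined-format access log. The "Google-CWS" user-agent token
# is an assumption; verify it against Google's crawler docs.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'"(?P<method>[A-Z]+) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()
with open("access.log", encoding="utf-8") as log:  # path is a placeholder
    for line in log:
        match = LOG_LINE.search(line)
        if match and "Google-CWS" in match.group("ua"):
            hits[match.group("path")] += 1

# The paths fetched should correspond to URLs submitted in the
# Chrome Web Store dashboard (e.g., a privacy policy page).
for path, count in hits.most_common(10):
    print(f"{count:5d}  {path}")
```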