Is the Hype Around llms.txt Over? The Future Belongs to Agents
Until recently, the SEO and AI industry was buzzing about llms.txt. It was heralded as a new communication standard, designed to help Large Language Models (LLMs) “read” website content more easily. However, technological reality is rapidly testing these assumptions. It turns out that static content availability is not enough. The real revolution is heading toward Agentic AI: autonomous agents that don’t just consume content but act on behalf of the user.
This direction was accurately predicted by Wojciech Urban, Senior SEO & R&D Specialist at our company. In his analysis earlier this year, he stated directly:
“If we were to look for a new standard or technical solution that would facilitate integration and cooperation between our site and AI tools, I would bet more on MCP – Model Context Protocol. (…) The idea of an llms.txt file doesn’t bring anything new; AI crawlers manage without it.”
Wojciech pointed out at the time that the future lies in using MCP to conduct entire purchasing processes via a chatbot, without the need to ever visit the store’s website. Today’s announcement from Google regarding WebMCP serves as a strong confirmation of this vision.
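To make the agentic idea concrete: instead of an agent scraping pages, the store exposes callable tools (e.g. product search, cart, checkout) that the agent invokes directly. The sketch below is purely illustrative and does not use the real MCP SDK; the tool names, the registry, and the catalog data are all hypothetical, invented for this example.

```python
# Toy sketch of the tool-calling idea behind MCP / WebMCP.
# Everything here (tool names, catalog, registry) is hypothetical,
# not a real MCP API.

TOOLS = {}

def tool(fn):
    """Register a function as an agent-callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def search_products(query: str) -> list[dict]:
    # A real store would query its catalog; we return fixed sample data.
    catalog = [
        {"id": 1, "name": "running shoes", "price": 89.99},
        {"id": 2, "name": "trail shoes", "price": 119.99},
    ]
    return [p for p in catalog if query in p["name"]]

@tool
def add_to_cart(cart: list, product_id: int) -> list:
    cart.append(product_id)
    return cart

@tool
def checkout(cart: list) -> str:
    return f"order placed for {len(cart)} item(s)"

# An agent-side loop would discover TOOLS and chain calls to complete
# a purchase, never rendering the store's website:
cart = []
hits = TOOLS["search_products"]("trail")
cart = TOOLS["add_to_cart"](cart, hits[0]["id"])
print(TOOLS["checkout"](cart))  # order placed for 1 item(s)
```

The point of the sketch is the inversion it shows: the site stops being pages to crawl and becomes a set of operations an agent can execute end to end.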