Internet traffic is increasingly driven by bots and artificial intelligence (AI) agents. A report by cybersecurity firm HUMAN Security reveals that in 2025, automated internet traffic grew 23.51% year-on-year (YoY), nearly eight times faster than human traffic, which rose just 3.10%.
Automated traffic refers to all non-human visits to websites and apps generated by software programmes (bots) rather than direct human action, with AI-driven traffic defined as visits carried out by AI systems, including AI agents.
This comes on the back of the rise of generative AI (GenAI) assistants such as ChatGPT, Claude, and more recently OpenClaw’s Clawbot, which rely extensively on continuous information access. AI-driven traffic increased 187% between January and December last year. The report states that with the rise of these assistants, bot traffic will decisively surpass human activity by 2027, marking a fundamental transformation in how the internet operates.
The report revealed that OpenAI’s bots, including ChatGPT User, OAI-SearchBot, GPTBot, and ChatGPT Agent, accounted for roughly 69% of all AI-driven traffic in 2025. Meta contributed about 16% through Meta-ExternalAgent, while Anthropic accounted for around 11% via ClaudeBot and Claude-SearchBot. This means that a few organisations shape the overall traffic exposed to AI agents.
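The bot names above correspond to User-Agent tokens that the operators publish, which site owners can target individually in a robots.txt file. The sketch below is illustrative only; the exact tokens and their behaviour are defined in each vendor's crawler documentation, and compliance is voluntary on the bot's part:

```
# Illustrative robots.txt: block a training crawler
# while allowing a search/retrieval bot
User-agent: GPTBot
Disallow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ClaudeBot
Disallow: /
```

Because well-behaved crawlers identify themselves this way, measurement firms such as HUMAN Security can attribute traffic shares to specific operators.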
The report adds that the traffic is more concentrated in industries such as retail, media, and travel, dominated by a few operators, and still largely driven by training crawlers, though this is changing.
Further, the composition of this AI-driven traffic is also changing. While at the start of 2025, training crawlers made up about 90% of such traffic and real-time scrapers 10%, by December, real-time scrapers rose to 24%, and a new agentic category capable of autonomous actions emerged at 1.7%.
Training crawlers are those used to collect data for machine learning models. Crawlers are automated programmes that browse the internet to discover, scan, and index web content. Indexing makes websites searchable and accessible to users. So, while web crawlers discover and map entire websites over time, real-time scrapers extract specific, targeted data points on demand for immediate queries.
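In practice, the categories the report distinguishes — training crawlers, real-time scrapers, and agents — are typically inferred from the User-Agent header of each request. A minimal sketch in Python, where the token lists are illustrative examples drawn from the bot names mentioned above, not an exhaustive or authoritative taxonomy:

```python
# Coarse classification of automated traffic by User-Agent token.
# Token-to-category mapping is an illustrative assumption, not the
# report's methodology.
TRAINING_CRAWLERS = {"GPTBot", "ClaudeBot", "Meta-ExternalAgent"}
REALTIME_FETCHERS = {"OAI-SearchBot", "Claude-SearchBot", "ChatGPT-User"}

def classify(user_agent: str) -> str:
    """Return a coarse traffic category for a User-Agent header."""
    for token in TRAINING_CRAWLERS:
        if token in user_agent:
            return "training-crawler"
    for token in REALTIME_FETCHERS:
        if token in user_agent:
            return "real-time-fetcher"
    return "human-or-unknown"

print(classify("Mozilla/5.0; compatible; GPTBot/1.2"))  # training-crawler
```

Real bot-management systems combine such signals with IP verification and behavioural analysis, since a User-Agent string alone is trivially spoofed.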
Meanwhile, AI agents are autonomous systems that reason, plan, and take multi-step actions to accomplish tasks, whereas both scrapers and crawlers are passive tools.
Automated traffic in 2025 has thus moved beyond merely gathering information to actively engaging in commerce. These agents can now create accounts, manage sessions, and complete transactions, introducing financial and contractual implications.
This kind of traffic is a concern because of the strain it can place on internet infrastructure, slowing websites for legitimate users, and because it can enable malicious activities such as data scraping, account takeovers, and fraudulent ad clicks. The internet is shifting toward a model where machines increasingly act on behalf of users, requiring new frameworks of trust and governance.