AI bots strain Wikimedia as bandwidth surges 50%
Automated AI bots seeking training data threaten Wikipedia project stability, foundation says.
Wikimedia engineers quickly rerouted traffic to reduce congestion, but the event revealed a deeper problem: the baseline bandwidth had already been largely consumed by bots scraping media at scale. Unlike humans, who tend to view popular and frequently cached articles, bots crawl obscure, rarely accessed pages, forcing Wikimedia’s core datacenters to serve those requests directly. Some even rotate through residential IP addresses to evade blocking, a tactic that has become common enough to force individual developers like Xe Iaso to adopt drastic protective measures for their code repositories.