AI bots strain Wikimedia as bandwidth surges 50%


Automated AI bots seeking training data threaten Wikipedia project stability, foundation says.

Wikimedia engineers quickly rerouted traffic to ease the congestion, but the event revealed a deeper problem: the baseline bandwidth had already been largely consumed by bots scraping media at scale. Unlike human readers, who tend to view popular, frequently cached articles, bots crawl obscure, rarely accessed pages, forcing Wikimedia's core datacenters to serve those requests directly. Some crawlers even rotate through residential IP addresses to evade blocking, tactics that have become common enough to force individual developers like Xe Iaso to adopt drastic protective measures for their code repositories.

Related news:

Wikipedia is struggling with voracious AI bot crawlers

How crawlers impact the operations of the Wikimedia projects

Amateur photographers hope to fix Wikipedia's 'terrible' pictures