Show HN: An experimental AntiBot, AntiCrawl reverse proxy for the web
An experimental AntiBot, AntiCrawl reverse proxy for serving simple static content. - pulkitsharma07/OnlyHumans-Proxy
Existing anti-bot approaches incur an unnecessary cognitive load on humans (more so on those who do not use "popular" browsers) and require a dependency on paid providers like Cloudflare or DataDome. Meanwhile, Crawling-as-a-Service is on the rise, LLMs are being freely trained on the text data of the web, and no one honours robots.txt anymore; we need better approaches to control who gets access to human-generated content. Comment sections and other fancy JS interactivity can still be built with regular web frameworks, but when serving any human-generated content you can use an iframe that points to your OnlyHumans instance, as sketched below.
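As a rough illustration of that embedding pattern (the hostname, path, and element id below are placeholders, not part of the project), the host page built with any framework can mount the proxied human-generated content in an iframe:

```typescript
// Minimal sketch: mount human-generated content served by an
// OnlyHumans-Proxy instance inside an otherwise ordinary page.
// "https://onlyhumans.example.com/posts/42" is a hypothetical URL
// for your own OnlyHumans deployment.
const frame = document.createElement("iframe");
frame.src = "https://onlyhumans.example.com/posts/42";
frame.width = "100%";
frame.height = "600";

// "#human-content" is an assumed container element in the host page.
document.querySelector("#human-content")?.appendChild(frame);
```

The rest of the page (comments, navigation, other interactivity) stays on the regular stack; only the framed content passes through the OnlyHumans instance.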