The article describes Anubis, a proof-of-work system that protects websites from large-scale AI scraping by issuing Hashcash-style challenges. It is positioned as an interim measure while more sophisticated browser fingerprinting techniques are developed to distinguish legitimate users from automated scrapers. The system requires JavaScript and is a response to the changing dynamics of web scraping by AI companies.
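The article does not spell out Anubis's exact challenge format, but the Hashcash-style idea can be sketched as follows: the client must find a nonce whose SHA-256 hash of the challenge string meets a difficulty target (here, a hypothetical leading-zero-hex-digit target), which is expensive to compute but cheap for the server to verify. The function names and parameters below are illustrative assumptions, not Anubis's API.

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that
    SHA-256(challenge + nonce) begins with `difficulty` zero hex digits.
    This is a generic Hashcash-style sketch, not Anubis's actual protocol."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash suffices to check the claimed solution."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# Example: solving is the costly step; verification is one hash call.
nonce = solve_challenge("example-token", 4)
assert verify("example-token", nonce, 4)
```

The asymmetry is the point: a legitimate visitor pays the cost once per session in the browser (hence the JavaScript requirement), while a scraper hitting thousands of pages pays it repeatedly.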
Background
AI companies increasingly scrape web content at scale, causing server strain and downtime for website operators. This has led to the development of various anti-scraping measures to balance accessibility with protection against automated data collection.
- Source: Lobsters
- Published: Mar 28, 2026 at 02:38 AM
- Score: 5.0 / 10