Review of another article that cites it
Three days ago, Drew DeVault, founder and CEO of SourceHut, published a blog post called "Please stop externalizing your costs directly into my face", in which he complained that LLM companies were crawling data without respecting robots.txt and causing severe outages for SourceHut.
LLM crawlers don't respect robots.txt, and they hit expensive endpoints like git blame, every page of every git log, and every commit in your repository. They do so using random User-Agents from tens of thousands of IP addresses, each one making no more than one HTTP request, trying to blend in with user traffic.
Because of this, it's hard to come up with a good set of mitigations.
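For context on what "respecting robots.txt" means here: a polite crawler is expected to fetch a site's robots.txt and check whether a URL is allowed before requesting it. Below is a minimal sketch using Python's standard urllib.robotparser; the host, path, and User-Agent are placeholders, not values from the article. This is the check that the crawlers described above simply skip.

```python
from urllib.robotparser import RobotFileParser

# Placeholder values for illustration only (not from the article).
ROBOTS_URL = "https://git.example.org/robots.txt"
TARGET_URL = "https://git.example.org/~user/repo/blame/main/file.c"  # an "expensive" endpoint
USER_AGENT = "ExampleBot/1.0"

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse robots.txt

if rp.can_fetch(USER_AGENT, TARGET_URL):
    print("robots.txt permits this URL; a polite crawler may fetch it")
else:
    print("robots.txt disallows this URL; a polite crawler skips it")
```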