kopasz7

joined 8 months ago
[–] kopasz7@sh.itjust.works 24 points 1 day ago (1 children)

Search engines have been running relatively fine for decades now. But the crawlers from AI companies basically DDoS hosts in comparison, sending so many requests in such a short interval. They also crawl dynamic links, which are expensive to render compared to a static page, ignore robots.txt entirely, or even use it to discover unlinked pages.
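For anyone unfamiliar with what "respecting robots.txt" means in practice, here's a minimal sketch of what a well-behaved crawler does before fetching a page, using Python's standard robotparser module (the host, URL, and user-agent string are just placeholders):

```python
from urllib import robotparser

# A polite crawler fetches robots.txt first and honors its rules.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder host
rp.read()

user_agent = "ExampleBot"  # placeholder crawler name
url = "https://example.com/search?q=some-dynamic-page"  # placeholder URL

if rp.can_fetch(user_agent, url):
    delay = rp.crawl_delay(user_agent)  # honor Crawl-delay if the site sets one
    print(f"Allowed to fetch {url}; waiting {delay or 0}s between requests")
else:
    print(f"robots.txt disallows {url}; a polite crawler skips it")
```

The abusive crawlers simply skip that check, or worse, treat the Disallow lines as a list of interesting paths to hit.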

Servers have finite resources, especially self-hosted sites, while AI companies have disproportionately more at their disposal, easily grinding other systems to a halt by overwhelming them with requests.

[–] kopasz7@sh.itjust.works 11 points 2 days ago

The ultimate validation is to see if it gets sent.