this post was submitted on 26 Nov 2025
413 points (96.8% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


Got a warning for my blog going over 100GB in bandwidth this month... which sounded incredibly unusual. My blog is text and a couple images and I haven't posted anything to it in ages... like how would that even be possible?

Turns out it's possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for 'Unknown robot'? This is actually bonkers.

Edit: As Thunraz points out below, there's a footnote that reads "Numbers after + are successful hits on 'robots.txt' files", so it's not scientific notation after all.

Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading these wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That's when my account was suspended for exceeding bandwidth (an artificial limit I put on there a while back and forgot about...), which is also why the 'last visit' for all the bots is November 12th.

[–] hoshikarakitaridia@lemmy.world 152 points 1 week ago (2 children)

Fucking hell.

Yeah and that's why people are using cloudflare so much.

[–] artyom@piefed.social 124 points 1 week ago (2 children)

One corporation DDoSes your server to death so that you need the other corporation's protection.

[–] MaggiWuerze@feddit.org 84 points 1 week ago

basically a protection racket

[–] muffedtrims@lemmy.world 29 points 1 week ago (1 children)

That's a nice website you gots there, would be a shame if something weres to happen to it.

[–] Agent641@lemmy.world 7 points 1 week ago (1 children)

We accidentally the whole config file

[–] Lee@retrolemmy.com 60 points 1 week ago (1 children)

A friend (works in IT, but asks me about server related things) of a friend (not in tech at all) has an incredibly low-traffic niche forum. It was running really slow (on shared hosting) due to bots. The forum software counts unique visitors per 15 mins and it was about 15k/15 mins for over a week. I told him to add Cloudflare. It dropped to about 6k/15 mins. We experimented with turning Cloudflare off/on and it was pretty consistent. So then I put Anubis on a server I have and they pointed the domain to my server. Traffic dropped to less than 10/15 mins. I've been experimenting with toggling Anubis/Cloudflare on/off for a couple months now with this forum. I have no idea how the bots haven't scraped all of the content by now.

TLDR: in my single isolated test, Cloudflare blocks 60% of crawlers. Anubis blocks presumably all of them.

Also if anyone active on Lemmy runs a low traffic personal site and doesn't know how or can't run Anubis (eg shared hosting), I have plenty of excess resources I can run Anubis for you off one of my servers (in a data center) at no charge (probably should have some language about it not being perpetual, I have the right to terminate without cause for any reason and without notice, no SLA, etc). Be aware that it does mean HTTPS is terminated at my Anubis instance, so I could log/monitor your traffic if I wanted as well, so that's a risk you should be aware of.

[–] MinFapper@startrek.website 16 points 1 week ago (2 children)

It's interesting that anubis has worked so well for you in practice.

What do you think of this guy's take?

https://lock.cmpxchg8b.com/anubis.html

[–] pipe01@programming.dev 7 points 1 week ago (1 children)

I wouldn't be surprised if most bots just don't run any JavaScript so the check always fails

[–] slazer2au@lemmy.world 103 points 1 week ago (1 children)

AI scrapers are the new internet DDoS.

Might want to throw something in front of your blog to ward them off, like Anubis or a tarpit.

[–] Eyekaytee@aussie.zone 33 points 1 week ago* (last edited 1 week ago) (2 children)

the one with the quadrillion hits is this bad boy: https://www.babbar.tech/crawler

Babbar.tech is operating a crawler service named Barkrowler which fuels and update our graph representation of the world wide web. This database and all the metrics we compute with are used to provide a set of online marketing and referencing tools for the SEO community.

[–] porcoesphino@mander.xyz 8 points 1 week ago (1 children)
[–] Jessica@discuss.tchncs.de 11 points 1 week ago (1 children)

It's a quote from the website

[–] Vorpal@programming.dev 11 points 1 week ago (2 children)

It is a common custom to indicate quotes with either "quotes" or, for a longer quote, a

> block quote

The latter can be done by prefixing the line with a > here on Lemmy (it uses the common markdown syntax).

Doing either of these helps avoid ambiguity.

[–] Jessica@discuss.tchncs.de 6 points 1 week ago

You replied to the wrong person. I already know this, but clearly the person who posted the quote doesn't ;)

[–] porcoesphino@mander.xyz 5 points 1 week ago

Thanks for taking the time. I always find it hard to follow up and point out the ambiguity / alternative without coming across in some unwelcome way.

[–] dual_sport_dork@lemmy.world 71 points 1 week ago (2 children)

I run an ecommerce site and lately they've latched onto one very specific product, hammering its page and any pages branching from it for no readily identifiable reason, at a rate of several hundred requests every second. I found out pretty quickly, because suddenly our view stats for that page in particular rocketed into the millions.

I had to insert a little script to IP ban these fuckers, which kicks in if I see a malformed user agent string or if you try to hit this page specifically more than 100 times. Through this I discovered that the requests are coming from hundreds of thousands of individual random IP addresses, many of which are located in Singapore, Brazil, and India, and mostly resolve down into those owned by local ISPs and cell phone carriers.
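
To make that concrete, here's a minimal sketch of what such a guard could look like (this is not the commenter's actual script; the Flask framework, the page path, and the thresholds are all assumptions for illustration). A real version would persist the ban list and expire the counters, but the idea is the same:

```python
# Hypothetical sketch (not the actual script described above): a tiny Flask
# hook that bans an IP after too many hits on one hot page, or on a
# malformed/empty User-Agent string. Path and thresholds are made up.
from collections import defaultdict

from flask import Flask, abort, request

app = Flask(__name__)

HOT_PAGE = "/products/widget-9000"  # hypothetical page the bots latched onto
HIT_LIMIT = 100                     # per-IP hits on that page before a ban

hit_counts = defaultdict(int)
banned_ips = set()


def looks_malformed(user_agent: str) -> bool:
    # Very naive check: empty or suspiciously short UA strings get flagged.
    return len(user_agent.strip()) < 10


@app.before_request
def block_abusive_clients():
    ip = request.remote_addr
    if ip in banned_ips:
        abort(403)
    if looks_malformed(request.headers.get("User-Agent", "")):
        banned_ips.add(ip)
        abort(403)
    if request.path == HOT_PAGE:
        hit_counts[ip] += 1
        if hit_counts[ip] > HIT_LIMIT:
            banned_ips.add(ip)
            abort(403)
```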

Of course they ignore your robots.txt as well. This smells like some kind of botnet thing to me.

[–] panda_abyss@lemmy.ca 21 points 1 week ago (2 children)

I don’t really get those bots.

Like, there are bots that are trying to scrape product info, or prices, or scan for quantity fields. But why the hell do some of these bots behave the way they do?

Do you use Shopify by chance? With Shopify the bots could be scraping the product.json endpoint unless it’s disabled in your theme. Shopify just seems to show the updated at timestamp from the db in their headers+product data, so inventory quantity changes actually result in a timestamp change that can be used to estimate your sales.

There are companies that do that and sell sales numbers to competitors.

No idea why they have inventory info on their products table, it’s probably a performance optimization.
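
To illustrate the technique (this is only a rough sketch, not anyone's production scraper; the store URL is hypothetical and it assumes the public /products.json endpoint hasn't been disabled), the polling looks roughly like this:

```python
# Rough illustration of the timestamp-watching described above: poll the
# store's public product JSON and count how often each variant's updated_at
# changes, as a crude proxy for inventory movement (sales/restocks).
import time

import requests

STORE = "https://example-store.myshopify.com"  # hypothetical store URL

last_seen = {}      # variant id -> last updated_at seen
change_counts = {}  # variant id -> number of observed changes

while True:
    resp = requests.get(f"{STORE}/products.json", params={"limit": 250}, timeout=30)
    resp.raise_for_status()
    for product in resp.json().get("products", []):
        for variant in product.get("variants", []):
            vid = variant["id"]
            stamp = variant.get("updated_at", "")
            if vid in last_seen and last_seen[vid] != stamp:
                change_counts[vid] = change_counts.get(vid, 0) + 1
            last_seen[vid] = stamp
    time.sleep(900)  # poll every 15 minutes
```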

I haven’t really done much scraping work in a while, not since before these new stupid scrapers started proliferating.

[–] dual_sport_dork@lemmy.world 21 points 1 week ago (9 children)

Negative. Our solution is completely home grown. All artisanal-like, from scratch. I can't imagine I reveal anything anyone would care about much except product specs, and our inventory and pricing really doesn't change very frequently.

Even so, you'd think someone bothering to run a botnet to hound our site would distribute page loads across all of our products, right? Not just one. It's nonsensical.

[–] panda_abyss@lemmy.ca 11 points 1 week ago

Yeah, that's the kind of weird shit I don't understand. Someone on the other end is paying for servers and a residential proxy to send that traffic. Why?

[–] billygoat@catata.fish 5 points 1 week ago

I see the same thing, but hitting my Lemmy instance. Not much you can do other than start IP banning or geo-IP banning.

[–] Thunraz@feddit.org 49 points 1 week ago (1 children)

It's 12,181 hits, and the number after the plus sign is the robots.txt hits. See the footnote at the bottom of your screenshot.

[–] benagain@lemmy.ml 20 points 1 week ago (2 children)

Phew, so I'm a dumbass and not reading it right. I wonder how they've managed to use 3MB per visit?

[–] arandomthought@sh.itjust.works 16 points 1 week ago (1 children)

The robots are a problem, but luckily we're not into the hepamegaquintogilarillions... Yet.

[–] benagain@lemmy.ml 9 points 1 week ago* (last edited 1 week ago)

12,000 visits, with 181 of those to the robots.txt file makes way, way more sense. The 'Not viewed traffic' adds up to 136,957 too - so I should have figured it out sooner.

I couldn't wrap my head around how large the number was and how many visits that would actually entail to reach that number in 25 days. Turns out that would be roughly 5.64 quinquinquagintillion visits per nanosecond. Call it a hunch, but I suspect my server might not handle that.

[–] EarMaster@lemmy.world 4 points 1 week ago* (last edited 1 week ago) (1 children)

The traffic is really suspicious. Do you by any chance have a health or heartbeat endpoint that provides continuous output? That would explain why so few hits cause so much traffic.

[–] carrylex@lemmy.world 28 points 1 week ago* (last edited 1 week ago) (1 children)
  1. Get a blocklist
  2. Enable rate limits
  3. Get a proper robots.txt
  4. ~~Profit~~ Silence
[–] Cort@lemmy.world 9 points 1 week ago (2 children)

Can you just turn the robots.txt into a click wrap agreement to charge robots high fees for access above a certain threshold?

[–] BurnedDonutHole@ani.social 27 points 1 week ago (1 children)

You can also use CrowdSec on your server to stop similar BS. It uses a community-based blacklist, and you choose what you want to block. Check it out.

https://github.com/crowdsecurity/crowdsec

[–] jjlinux@lemmy.zip 5 points 1 week ago (6 children)

I'm going to try and implement CrowdSec for all my Proxmox containers over Cloudflare tunnels. Wish me luck, and hope my wife and kids let me do this without constantly making shit up for me to do.

[–] scrubbles@poptalk.scrubbles.tech 27 points 1 week ago (3 children)

Check out Anubis. If you have a reverse proxy it is very easy to add, and the bots stopped spamming after I added it to mine.

[–] Object@sh.itjust.works 6 points 1 week ago

I also recommend it.

lol

[–] K3can@lemmy.radio 5 points 1 week ago (1 children)

I recently added Anubis and its validation rate is under 40%. In other words, 60% of the incoming requests are likely bots and are now getting blocked. Definitely recommend.

[–] MinFapper@startrek.website 4 points 1 week ago (1 children)

It's interesting that anubis has worked so well for you in practice.

What do you think of this guy's take?

https://lock.cmpxchg8b.com/anubis.html

[–] scrubbles@poptalk.scrubbles.tech 14 points 1 week ago (7 children)

> This dance to get access is just a minor annoyance for me, but I question how it proves I'm not a bot. These steps can be trivially and cheaply automated.

I don't think the author understands the point of Anubis. The point isn't to block bots completely from your site, bots can still get in. The point is to put up a problem at the door to the site. This problem, as the author states, is relatively trivial for the average device to solve, it's meant to be solved by a phone or any consumer device.

The actual protection mechanism is scale: solving the challenge is cheap once but costly in aggregate. Bot farms aren't one single host or machine, they're thousands, tens of thousands of VMs running in clusters constantly trying to scrape sites. Say calculating the hash once takes about 5 seconds. Easy for a phone. Now multiply that by 1,000 scrapes of your site: that's 5,000 seconds, roughly an hour and a half of compute. Now we're talking about real dollars and cents lost. Scraping does have a cost, and having worked at a company that professionally scrapes content, I know they're aware of this. Most companies will back off from a page that takes too long or is too intensive to load, and that is why we see the dropoff in bot attacks. It's simply not worth it for them to scrape the site anymore.

So Anubis is "judging your value" by asking "Are you willing to put your money where your mouth is to access this site?" For a consumer it's a fraction of a fraction of a penny in electricity spent for that one page load, barely noticeable. For large bot farms it's real dollars wasted on my little lemmy instance/blog, and thankfully they've stopped caring.
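
To make the cost asymmetry concrete, here's a minimal proof-of-work sketch (this is not Anubis's actual code, which runs as JavaScript in the visitor's browser, and the difficulty value is made up): solving takes real CPU time, while verifying costs the server a single hash.

```python
# Minimal proof-of-work sketch illustrating the economics described above.
import hashlib
import itertools

DIFFICULTY_BITS = 20  # made-up difficulty; higher means more work per page load


def solve(challenge: str) -> int:
    """Grind nonces until sha256(challenge + nonce) has enough leading zero bits."""
    target = 1 << (256 - DIFFICULTY_BITS)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce


def verify(challenge: str, nonce: int) -> bool:
    # Verification is a single hash, so it's nearly free for the server.
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - DIFFICULTY_BITS))


# One visitor pays the solve cost once per challenge; a scraper hammering
# thousands of pages from thousands of IPs pays it over and over again.
nonce = solve("example-challenge")
assert verify("example-challenge", nonce)
```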

[–] phoenixz@lemmy.ca 21 points 1 week ago

AI bots killing the internet again? You don't say

[–] pendel@feddit.org 16 points 1 week ago

I had to pull an all-nighter to fix an unoptimized query. I had just launched a new website with barely any visitors and hadn't implemented caching yet for something I thought no one would use anyway, but a bot found it and broke my entire DB by hitting the endpoint again and again until nothing worked anymore.

[–] irmadlad@lemmy.world 14 points 1 week ago

Unknown Robot is your biggest fan.

[–] MystikIncarnate@lemmy.ca 10 points 1 week ago

fracking clankers.

[–] WolfLink@sh.itjust.works 8 points 1 week ago (1 children)

This is why I use Cloudflare. They block the worst of it and cache my content to reduce the load from the rest. It's not 100% but it does help.

[–] irmadlad@lemmy.world 4 points 1 week ago

LOL Someone took exception to your use of Cloudflare. Hilarious. Anyways, yeah, what Cloudflare doesn't get, pfSense does.

[–] Eyekaytee@aussie.zone 8 points 1 week ago

does your blog have a blackhole in it somewhere you forgot about 😄

[–] Vorpal@programming.dev 5 points 1 week ago (1 children)

What is that log analysis tool you are using in the picture? Looks pretty neat.

[–] benagain@lemmy.ml 4 points 1 week ago (3 children)

It's a mix; I put two screenshots together. On the left is my monthly bandwidth usage from cPanel, and on the right is AWStats (though I hid some sections so the Robots/Spiders section was closer to the top).

[–] KeenFlame@feddit.nu 5 points 1 week ago (4 children)

What is the blog about? It may be increased interest as search providers use them for normal searches now... or it could be a couple of already sentient doombots.

Please don't be a blog about von Neumann probes. Please don't be a blog about von Neumann probes. Please don't be a blog about von Neumann probes..

[–] ohshit604@sh.itjust.works 5 points 1 week ago

I just geo-restrict my server to my country; for certain services I'll run an IP blacklist and only whitelist the few known networks.

Works okay I suppose, kills the need for a WAF, haven’t had any issues with it.

[–] biggerbogboy@sh.itjust.works 3 points 1 week ago

Hydrogen bomb vs coughing baby type shit
