- cross-posted to:
- hackernews@lemmy.bestiver.se
The one-liner:
dd if=/dev/zero bs=1G count=10 | gzip -c > 10GB.gz
This is brilliant.
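For context on why this works: gzip collapses a stream of zeros extremely well, so the file stays tiny on disk while a crawler that blindly decompresses the response inflates it back to the full 10 GiB in memory. A quick sanity check, assuming gzip/zcat are installed (the compressed size is approximate):
ls -lh 10GB.gz        # compressed: on the order of 10 MiB
zcat 10GB.gz | wc -c  # decompressed: 10737418240 bytes (10 GiB)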
I mean, I am not a lawyer.
In Germany we have § 303b StGB. In short, it says that if you hinder someone else's data processing through physical means or malicious data, you can go to jail for up to 3 years. If the data processing is of major importance to someone, you can get up to 5 years, and in especially serious cases up to 10.
So if you have a zip bomb on your system and a crawler reads and unpacks it, you have committed two crimes: 1. You hindered that crawler's data processing. 2. Some ISP nodes inspect the traffic and can crash too. If the ISP is annoyed enough, you can go to jail for 5 years. This applies even if you didn't actually crash them because they had protection against it, since attempting it is also against the law.
Having a zip bomb sits in a gray area. Because attempting to disrupt data processing is illegal, having a zip bomb could be considered an attempt; however, I am not aware of any court ruling on this.
Edit: By the way, if you password-protect your zip bomb, everything is fine.
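A minimal sketch of that, assuming Info-ZIP's zip is available (the file name is made up; -e prompts for a password, and the trailing dash tells zip to read the payload from stdin):
dd if=/dev/zero bs=1G count=10 | zip -e bomb.zip -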
I wonder if having a robots.txt file that said to ignore the file/path would help.
I’m assuming a bad bot would ignore the robots.txt file. So you could argue that you put up a clear sign and they chose to ignore it.
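Something like this, for example (the path just mirrors the file name from the one-liner above):
User-agent: *
Disallow: /10GB.gz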
That statute is about severely disrupting other people's data processing of significant importance to them. The "submitting malicious data" variant requires intent to cause harm; the physical destruction, deletion, etc. variants don't. It is about crashing people's payroll systems, DDoSing, and so on, not about burning some CPU cycles and having a crawler subprocess crash with an OOM.
Why on earth would an ISP look inside this? Even if one did, they're professional enough to detect zip bombs. Which, by the way, is why this whole thing is pointless anyway: if you classify requests as malicious, just don't serve them. If that's not enough, it's much more sensible to go the Anubis route and demand proof of work, as that catches crawlers that come from a gazillion IPs with different user agents, etc.