Hi,
I run a web server that hosts a rather large archive of binaries. From time to time a badly behaved web robot comes by and pulls ALL of these binaries, without any speed limit, generating several dozen gigabytes of traffic within a few minutes. The bot itself can't be identified: it sends no distinctive client identification and looks like an ordinary browser without any additional information.
So my only idea for defending against this bot is to limit the traffic. That is, I'm looking for a way to define the maximum amount of traffic a single IP is allowed to pull from my server within a given time frame.
Any idea how this can be done? Does Apache offer a way to define such traffic limits? Or perhaps fail2ban does, somewhere?
When searching the web for this problem, I only find solutions that limit the bandwidth (i.e. the download speed), but that is NOT what I want.
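To make concrete what I mean (and what I've been toying with as a crude workaround): a minimal sketch that sums the transfer volume per client IP from the Apache access log and reports the offenders. The log path, the threshold, and the `over_limit` name are just placeholders of mine, not from any existing tool; I'd much prefer a proper Apache or fail2ban mechanism.

```shell
#!/bin/sh
# Rough sketch only: the helper name, log path, and threshold below are
# placeholders, not part of any existing tool.
#
# over_limit LOGFILE LIMIT_BYTES
#   Sums the response sizes per client IP from an Apache access log in
#   common/combined log format (field 1 = client IP, field 10 = bytes
#   sent) and prints "IP total_bytes" for every IP above the limit.
over_limit() {
    awk -v limit="$2" '
        $10 ~ /^[0-9]+$/ { bytes[$1] += $10 }   # skip "-" (no body sent)
        END {
            for (ip in bytes)
                if (bytes[ip] > limit)
                    print ip, bytes[ip]
        }
    ' "$1"
}

# A cron job could run something like this every few minutes against the
# recent part of the log and block the offenders, e.g.:
#   over_limit /var/log/apache2/access.log 10737418240 |
#       while read -r ip _; do iptables -I INPUT -s "$ip" -j DROP; done
```

But this feels fragile (log rotation, time windows, unblocking), which is why I'm asking whether there is a built-in way to do it.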
Thank you!