Scrubbing logs for bad IPs
Periodically, log files should be scrubbed for bad bots and malicious IPs. Let’s do that.
This will work with any log file in which you have one IP address per line.
The first thing we want to do is pull out all the IP addresses in our log file. Let’s go ahead and do that for lawsonry’s log file right now:
grep -E -o '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)' lawsonry.dashingwp.com.access.log > ipslist.txt
Make sure you stick all that on a single line!
Unfortunately, if you cat ipslist.txt, you’ll see a bunch of duplicate entries. To get rid of them, let’s sort the file:
sort -u ipslist.txt > ipslist
Now remove our old file with rm ipslist.txt because we don’t need it anymore. (Plain rm is enough here; there’s no need for the -rf flags on a single file.)
Go ahead and cat ipslist to see the list of unique IPs.
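As an aside, once you’re comfortable with both commands, you can collapse the extraction and deduplication into a single pipeline and skip the temporary file entirely. This sketch uses a compact but equivalent form of the same regex (one octet pattern followed by three .octet repetitions):
grep -E -o '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}' lawsonry.dashingwp.com.access.log | sort -u > ipslist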
With this list, you can use the methods described at the end of step 5 of this tutorial, where we’ll create a custom PHP script to scrub through the IPs and automatically print out deny entries formatted for Nginx. (Nginx’s directive is just deny; the deny from form is Apache syntax.)
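If you’d rather not wait for the script, here’s a minimal shell sketch that produces the same kind of output (blockips.conf is a hypothetical file name; put it wherever your Nginx setup expects):
# Turn each unique IP into an Nginx "deny <ip>;" line
awk '{print "deny " $1 ";"}' ipslist > blockips.conf
You can then pull that file into your server configuration with an include directive (for example, include /etc/nginx/blockips.conf; inside the http or server block) and reload Nginx to apply it.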
Alternatively, you could go straight to iptables (which is what I do) and just block them at the server level:
BLOCK_THIS_IP="x.x.x.x"
iptables -A INPUT -s "$BLOCK_THIS_IP" -j DROP
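And to block every address on the list in one go, loop over the file. This is a sketch assuming the ipslist file from above; run it as root, and remember that plain iptables rules don’t survive a reboot unless you persist them (for example with iptables-save):
# Append a DROP rule for each bad IP on the list
while read -r ip; do
  iptables -A INPUT -s "$ip" -j DROP
done < ipslist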