3 comments on “Web-site Security: blocking user agents, requests & query strings”

  1. Great write-up, Andy. Many good points and some smart strategies that I’ve taken the liberty of integrating into the next generation of the g-series blacklist (6G). Totally agree that blocking by IP is a big waste of time, and would add that blocking by user-agent and referrer is becoming equally futile. There are some well-known strings to look for, but it’s trivial these days to spoof just about everything except for the actual request string, which, as you’ve explained, includes numerous variables such as query string, request method, and so on. Will be mentioning this article in the upcoming “6G-beta” post. Cheers!

  2. Thanks for the positive feedback, Jeff; I’m looking forward to your 6G (and other Perishablepress articles).

    LOL: to quote from my own article, “rules may be too strict for some sites, e.g. blocking requests like ‘http://yoursite.com/logout?redir=http://yoursite.com/login.php’”.

    I have found that WordPress uses a very similar “GET” request if you are not signed on. So when I tried to approve your comment from the link in WordPress’s “Please Moderate” email, the request was 403’d.

    Everything works fine if you sign on in the usual manner and navigate to comments via the dashboard – so I WON’T be removing the block of “//”.

  3. A beta version of the 6G firewall has now been published at http://perishablepress.com/6g-beta/
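
For readers wondering what “blocking by request string” (as opposed to spoofable user-agent or referrer headers, as discussed in the first comment above) looks like in practice, here is a minimal Apache mod_rewrite sketch. It is an illustration only, with deliberately simplified patterns; it is not a rule taken from the article above or from the 5G/6G blacklists.

    # Illustration only: simplified patterns, not the actual 5G/6G rules.
    <IfModule mod_rewrite.c>
      RewriteEngine On
      # Spoofable: this only matches what the client *claims* to be.
      RewriteCond %{HTTP_USER_AGENT} (libwww-perl|wget) [NC]
      RewriteRule .* - [F,L]
      # Harder to disguise: these match what the client actually asks for,
      # i.e. the request method and the query string it sends.
      RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK|DEBUG)$ [NC,OR]
      RewriteCond %{QUERY_STRING} (https?|ftp)(:|%3A)(/|%2F){2} [NC]
      RewriteRule .* - [F,L]
    </IfModule>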
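
As for the “//” block mentioned in the second comment: the exact directive from the article is not reproduced here, but a rule in that spirit forbids a doubled slash (i.e. a second URL) inside the query string. That is also why a logged-out WordPress moderation click gets caught, since the login redirect carries the full destination URL in its query string. A hedged sketch, with a hypothetical example request:

    # Illustration only: forbid "//" (plain or URL-encoded) in the query string.
    # Catches probes like  /logout?redir=http://yoursite.com/login.php
    # but, as noted above, also 403s a logged-out WordPress moderation click,
    # which goes through something like
    #   /wp-login.php?redirect_to=http%3A%2F%2Fyoursite.com%2Fwp-admin%2F...
    RewriteEngine On
    RewriteCond %{QUERY_STRING} (/|%2F)(/|%2F) [NC]
    RewriteRule .* - [F,L]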
