#1 · 12-29-2009, 12:56 AM · Anonymous
How to Stop Unknown Robots from crawling my website?
Unknown robots are drastically and unnecessarily eating my website's bandwidth.
I can't identify them, because they crawl the site without giving any details such as a name or IP address; all I see in my stats is "Unknown robot crawling". If I knew the IP addresses, I could stop them with IP blocking.
These unknown robots also aren't obliged to obey the robots.txt standard or other techniques like the "nofollow" attribute.
They just crawl every website link they find on the web, without any user authentication.
They have no need to crawl my site and are of no use to me.
How can I stop them efficiently?
#2 · 12-29-2009, 06:25 AM · EthanJ
Most robots identify themselves with a custom user agent in the request headers, which can easily be blocked with .htaccess.

There are a number of good articles on this, like this one or this one. Let me know if you have any problems; it's a matter of identifying the offending bots/crawlers and banning them as you see fit.
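As a minimal sketch of that approach (the bot name below is a hypothetical placeholder, not one taken from this thread), an .htaccess rule keyed on the User-Agent header might look something like this:

# Sketch: return 403 Forbidden to requests whose User-Agent matches a bad bot
# "SomeBadBot" is a placeholder; substitute the agents you actually see in your logs
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} SomeBadBot [NC]
RewriteRule .* - [F,L]
</IfModule>

Multiple bots can be matched in one condition with alternation, e.g. (BotOne|BotTwo|BotThree).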
__________________
Wordpress Medusa | Private beta starting soon.
An opensource app to backup, update, manage multiple Wordpress installs from one integrated admin interface.
Can be fully automated. Great for developer maintenance contracts.


ContraFlo | Final production build available Q2.
A high load capable AJAX based media streaming CMS; out of the box CDN integration and hardware load balancing; tiered license pricing; whitelabel hosting available.
#3 · 06-15-2010, 10:25 AM · Ouisri
You have a choice: block them, or improve your site engine. Both need modifications to your code. EthanJ has guided you on how to block them; I can guide you on how to make your site engine run faster, for example by replacing outdated functions like "ereg" with "preg".
#4 · 06-15-2010, 11:20 AM · blahblahblah2
http://www.youfoundjake.com/2007/12/...up-a-bot-trap/ Just food for thought. It's from '07, so I don't even know if it works on current-day crawlies etc. EthanJ... where do you stand on the use of bot traps? Been wondering what your opinion would be.

#5 · 06-15-2010, 12:49 PM · EthanJ
I think they're a tiny bit unnecessary and flat-out useless in most cases. The central idea behind a bot trap is that you trust self-declared user-agent strings, and no self-respecting malicious bot is going to declare itself as anything other than a trusted user agent (like Googlebot, or even a regular browser). In other words, it's not likely a shoplifter will walk into a store carrying a huge 'I'M HERE TO STEAL THINGS' sign, is it? Which undermines the entire point of the exercise.

So in practice all you're doing is banning lesser-known, generally non-malicious bots. If you want to get rid of 90%+ of scrapers (which is what the vast majority of bots that can be detected by user-agent string are doing), a static list in your .htaccess file will do a good enough job.

To be entirely honest, though, doing any kind of security work whatsoever based on the user-agent string is a fool's errand.
#6 · 06-15-2010, 02:34 PM · blahblahblah2
'Kay, was just wondering... you seem a lot more up on these matters than I am, so I was curious about your opinion on the topic.
#7 · 06-19-2010, 09:19 PM · inenigma
Been trying to block Yandex
Hi,

I've got this bot "spider88.yandex.ru" chewing thru my bandwidth. I've run the DNS thru whois on cqcounter.com and picked out the following IPs:

77.88.19.60, 77.88.21.11, 87.250.251.11, 93.158.134.11, 213.180.204.11, 213.180.204.211, 213.180.193.1, 213.180.199.34, 213.180.204.1

I've updated my .htaccess to include the following, which I thought would have blocked any access from the Yandex spider, but the sod's still getting thru??

# block Yandex
<IfModule mod_rewrite.c>
RewriteEngine On

RewriteCond %{REMOTE_ADDR} ^77\.88\.19\.$
RewriteCond %{REMOTE_ADDR} ^77\.88\.21\.$
RewriteCond %{REMOTE_ADDR} ^87\.250\.251\.$
RewriteCond %{REMOTE_ADDR} ^87\.250\.252\.$
RewriteCond %{REMOTE_ADDR} ^93\.158\.134\.$
RewriteCond %{REMOTE_ADDR} ^213\.180\.193\.$
RewriteCond %{REMOTE_ADDR} ^213\.180\.199\.$
RewriteCond %{REMOTE_ADDR} ^213\.180\.204\.$
RewriteRule ^(.*)$ - [F,L]
</IfModule>


First, are my rules in the .htaccess correct? Second, if they are correct (which I think they are), how do I identify what IP address it's coming thru on if whois isn't telling me the correct IP? Also, is there any way I can find out what requests Yandex is actually submitting, so I can try one of the other ways to blacklist it?

Any help is greatly appreciated.

Thanks,
David


Uh, got a hold of my raw access logs and picked out the IP address that the bot was running from. I'll see tomorrow if it gets blocked....
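A note on the rules quoted above, for anyone reading along: as written they can never match, because each pattern ends in \.$ (which requires the address to end immediately after the third dot), and consecutive RewriteCond lines are ANDed together unless the [OR] flag is given. A corrected sketch, assuming the intent is to block everything from those address prefixes:

# block Yandex (corrected sketch of the rules above)
<IfModule mod_rewrite.c>
RewriteEngine On
# [OR] joins the conditions; the trailing $ anchors are dropped so each prefix can match
RewriteCond %{REMOTE_ADDR} ^77\.88\.19\. [OR]
RewriteCond %{REMOTE_ADDR} ^77\.88\.21\. [OR]
RewriteCond %{REMOTE_ADDR} ^87\.250\.251\. [OR]
RewriteCond %{REMOTE_ADDR} ^87\.250\.252\. [OR]
RewriteCond %{REMOTE_ADDR} ^93\.158\.134\. [OR]
RewriteCond %{REMOTE_ADDR} ^213\.180\.193\. [OR]
RewriteCond %{REMOTE_ADDR} ^213\.180\.199\. [OR]
RewriteCond %{REMOTE_ADDR} ^213\.180\.204\.
RewriteRule ^(.*)$ - [F,L]
</IfModule>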

#8 · 06-21-2010, 05:35 AM · inenigma
Hi,

It looks like I've pissed them off, as they are now pinging my site even more.

These are only some of the messages from my raw access log:

95.108.247.252 - - [21/Jun/2010:19:18:37 +1000] "GET /USM/calendar.php?do=getinfo&day=2010-4-28&c=1 HTTP/1.1" 200 4541 "-" "Yandex/1.01.001 (compatible; Win16; I)"
95.108.247.252 - - [21/Jun/2010:19:19:20 +1000] "GET /USM/calendar.php?do=getinfo&day=2010-3-16&c=1 HTTP/1.1" 200 4540 "-" "Yandex/1.01.001 (compatible; Win16; I)"
95.108.247.252 - - [21/Jun/2010:19:20:48 +1000] "GET /USM/calendar.php?do=getinfo&day=2013-4-8&c=1 HTTP/1.1" 200 4540 "-" "Yandex/1.01.001 (compatible; Win16; I)"
95.108.247.252 - - [21/Jun/2010:19:21:31 +1000] "GET /USM/calendar.php?do=getinfo&day=2007-10-11&c=1 HTTP/1.1" 200 4544 "-" "Yandex/1.01.001 (compatible; Win16; I)"


This is the part of my .htaccess file that's supposed to block Yandex spiders

# block Yandex and redirect the request back to their own site.
<IfModule mod_rewrite.c>
RewriteEngine On

RewriteCond %{REMOTE_ADDR} ^77\.88\.19\.$
RewriteCond %{REMOTE_ADDR} ^77\.88\.21\.$
RewriteCond %{REMOTE_ADDR} ^87\.250\.251\.$
RewriteCond %{REMOTE_ADDR} ^87\.250\.252\.$
RewriteCond %{REMOTE_ADDR} ^93\.158\.134\.$
RewriteCond %{REMOTE_ADDR} ^95\.108\.247\.$
RewriteCond %{REMOTE_ADDR} ^95\.108\.247\.252$
RewriteCond %{REMOTE_ADDR} ^213\.180\.193\.$
RewriteCond %{REMOTE_ADDR} ^213\.180\.199\.$
RewriteCond %{REMOTE_ADDR} ^213\.180\.204\.$
RewriteRule ^(.*)$ - [F,L]
</IfModule>


Help !!!!

Sorry for the thread hijack... I didn't think anyone would mind seeing as it had been dead since last Dec.
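One thing the log lines above make clear is that this crawler declares "Yandex" in its User-Agent string, so a user-agent match is another option if the IP-based rules keep missing it (Yandex crawls from many addresses). A sketch in the same style as the rules already in the file:

# Sketch: deny anything declaring Yandex in its User-Agent, regardless of IP
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} yandex [NC]
RewriteRule ^(.*)$ - [F,L]
</IfModule>

Yandex's crawler is also generally reported to honour robots.txt, so a "User-agent: Yandex" / "Disallow: /" entry there may be worth trying first.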
#9 · 07-01-2010, 09:56 PM · nick47274
I'm not good at server-side stuff, but I think it would be:

Deny from (IP)

I'm sure if you search for this on Google, you can find some good info to help you.
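For completeness, in the Apache 2.2-style syntax used elsewhere in this thread, a sketch of that kind of flat per-IP block (using the address from the logs earlier purely as an illustration) would be:

# Sketch: flat per-IP block in .htaccess
Order allow,deny
Allow from all
Deny from 95.108.247.252

Whole ranges can be blocked the same way by giving only the leading octets, e.g. "Deny from 95.108.247".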
#10 · 07-02-2010, 02:34 AM · inenigma
Yeah, put that in as well, but the sods kept getting thru? Eventually the ops placed a server-wide block (anyone know what this is?) on the IP address and that finally stopped them.

Cheers,
David
#11 · 11-26-2010, 05:47 PM · mongo50
Unknown robot (identified by 'crawl')
Quote:
Originally Posted by EthanJ View Post
Most robots identify themselves with a custom user agent in the request headers, which can easily be blocked with .htaccess.
Three of the five top bots in one of my domains show up in AWStats as "Unknown robot (identified by 'bot/' or 'bot-')", "Unknown robot (identified by 'crawl')", and "Unknown robot (identified by 'robot')". Interestingly, they don't show up at all in WebLog Expert's parsing of the same logs.

I'd like to block these spiders. Is there any way to do this? Thanks for your help...
#12 · 12-15-2010, 07:24 AM · free1proxy
You need to make use of .htaccess to keep the unknown robots away from the website.
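As a rough sketch of what that could look like for the AWStats "unknown robot" buckets mentioned a few posts up: flag user agents containing the generic substrings AWStats keys on ('bot/', 'crawl', 'robot'), deny them, and un-flag any legitimate crawlers you still want. This is deliberately broad, and the whitelist entries below are only examples:

# Sketch: flag generic/unknown robots by User-Agent substring
SetEnvIfNoCase User-Agent "crawl" generic_bot
SetEnvIfNoCase User-Agent "bot/" generic_bot
SetEnvIfNoCase User-Agent "robot" generic_bot
# un-flag crawlers you actually want indexing the site (example entries)
SetEnvIfNoCase User-Agent "Googlebot" !generic_bot
SetEnvIfNoCase User-Agent "msnbot" !generic_bot

<Limit GET POST>
order allow,deny
allow from all
Deny from env=generic_bot
</Limit>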
#13 · 12-15-2010, 07:29 AM · Adam H
I currently use this on all my sites. Basically it blocks all bad user agents, bad bots and scrapers. Not only can it save your content from being mass harvested, but it will also save you a little bandwidth because of fewer bots running around your site. Hope it helps.

Quote:
SetEnvIfNoCase User-Agent "^Black Hole" bad_bot
SetEnvIfNoCase User-Agent "^Titan" bad_bot
SetEnvIfNoCase User-Agent "^WebStripper" bad_bot
SetEnvIfNoCase User-Agent "^NetMechanic" bad_bot
SetEnvIfNoCase User-Agent "^CherryPicker" bad_bot
SetEnvIfNoCase User-Agent "^EmailCollector" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^WebBandit" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
SetEnvIfNoCase User-Agent "^ExtractorPro" bad_bot
SetEnvIfNoCase User-Agent "^CopyRightCheck" bad_bot
SetEnvIfNoCase User-Agent "^Crescent" bad_bot
SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^SiteSnagger" bad_bot
SetEnvIfNoCase User-Agent "^ProWebWalker" bad_bot
SetEnvIfNoCase User-Agent "^CheeseBot" bad_bot
SetEnvIfNoCase User-Agent "^Teleport" bad_bot
SetEnvIfNoCase User-Agent "^TeleportPro" bad_bot
SetEnvIfNoCase User-Agent "^MIIxpc" bad_bot
SetEnvIfNoCase User-Agent "^Telesoft" bad_bot
SetEnvIfNoCase User-Agent "^Website Quester" bad_bot
SetEnvIfNoCase User-Agent "^WebZip" bad_bot
SetEnvIfNoCase User-Agent "^moget/2.1" bad_bot
SetEnvIfNoCase User-Agent "^WebZip/4.0" bad_bot
SetEnvIfNoCase User-Agent "^WebSauger" bad_bot
SetEnvIfNoCase User-Agent "^WebCopier" bad_bot
SetEnvIfNoCase User-Agent "^NetAnts" bad_bot
SetEnvIfNoCase User-Agent "^Mister PiX" bad_bot
SetEnvIfNoCase User-Agent "^WebAuto" bad_bot
SetEnvIfNoCase User-Agent "^TheNomad" bad_bot
SetEnvIfNoCase User-Agent "^WWW-Collector-E" bad_bot
SetEnvIfNoCase User-Agent "^RMA" bad_bot
SetEnvIfNoCase User-Agent "^libWeb/clsHTTP" bad_bot
SetEnvIfNoCase User-Agent "^asterias" bad_bot
SetEnvIfNoCase User-Agent "^httplib" bad_bot
SetEnvIfNoCase User-Agent "^turingos" bad_bot
SetEnvIfNoCase User-Agent "^spanner" bad_bot
SetEnvIfNoCase User-Agent "^InfoNaviRobot" bad_bot
SetEnvIfNoCase User-Agent "^Harvest/1.5" bad_bot
SetEnvIfNoCase User-Agent "^Bullseye/1.0" bad_bot
SetEnvIfNoCase User-Agent "^Mozilla/4.0 (compatible; BullsEye; Windows 95)" bad_bot
SetEnvIfNoCase User-Agent "^Crescent Internet ToolPak HTTP OLE Control v.1.0" bad_bot
SetEnvIfNoCase User-Agent "^CherryPickerSE/1.0" bad_bot
SetEnvIfNoCase User-Agent "^CherryPicker /1.0" bad_bot
SetEnvIfNoCase User-Agent "^WebBandit/3.50" bad_bot
SetEnvIfNoCase User-Agent "^NICErsPRO" bad_bot
SetEnvIfNoCase User-Agent "^Microsoft URL Control - 5.01.4511" bad_bot
SetEnvIfNoCase User-Agent "^DittoSpyder" bad_bot
SetEnvIfNoCase User-Agent "^Foobot" bad_bot
SetEnvIfNoCase User-Agent "^WebmasterWorldForumBot" bad_bot
SetEnvIfNoCase User-Agent "^SpankBot" bad_bot
SetEnvIfNoCase User-Agent "^BotALot" bad_bot
SetEnvIfNoCase User-Agent "^lwp-trivial/1.34" bad_bot
SetEnvIfNoCase User-Agent "^lwp-trivial" bad_bot
SetEnvIfNoCase User-Agent "^Wget/1.6" bad_bot
SetEnvIfNoCase User-Agent "^BunnySlippers" bad_bot
SetEnvIfNoCase User-Agent "^Microsoft URL Control - 6.00.8169" bad_bot
SetEnvIfNoCase User-Agent "^URLy Warning" bad_bot
SetEnvIfNoCase User-Agent "^Wget/1.5.3" bad_bot
SetEnvIfNoCase User-Agent "^LinkWalker" bad_bot
SetEnvIfNoCase User-Agent "^cosmos" bad_bot
SetEnvIfNoCase User-Agent "^moget" bad_bot
SetEnvIfNoCase User-Agent "^hloader" bad_bot
SetEnvIfNoCase User-Agent "^humanlinks" bad_bot
SetEnvIfNoCase User-Agent "^LinkextractorPro" bad_bot
SetEnvIfNoCase User-Agent "^Offline Explorer" bad_bot
SetEnvIfNoCase User-Agent "^Mata Hari" bad_bot
SetEnvIfNoCase User-Agent "^LexiBot" bad_bot
SetEnvIfNoCase User-Agent "^Web Image Collector" bad_bot
SetEnvIfNoCase User-Agent "^The Intraformant" bad_bot
SetEnvIfNoCase User-Agent "^True_Robot/1.0" bad_bot
SetEnvIfNoCase User-Agent "^True_Robot" bad_bot
SetEnvIfNoCase User-Agent "^BlowFish/1.0" bad_bot
SetEnvIfNoCase User-Agent "^JennyBot" bad_bot
SetEnvIfNoCase User-Agent "^MIIxpc/4.2" bad_bot
SetEnvIfNoCase User-Agent "^BuiltBotTough" bad_bot
SetEnvIfNoCase User-Agent "^ProPowerBot/2.14" bad_bot
SetEnvIfNoCase User-Agent "^BackDoorBot/1.0" bad_bot
SetEnvIfNoCase User-Agent "^toCrawl/UrlDispatcher" bad_bot
SetEnvIfNoCase User-Agent "^WebEnhancer" bad_bot
SetEnvIfNoCase User-Agent "^TightTwatBot" bad_bot
SetEnvIfNoCase User-Agent "^suzuran" bad_bot
SetEnvIfNoCase User-Agent "^VCI WebViewer VCI WebViewer Win32" bad_bot
SetEnvIfNoCase User-Agent "^VCI" bad_bot
SetEnvIfNoCase User-Agent "^Szukacz/1.4" bad_bot
SetEnvIfNoCase User-Agent "^QueryN Metasearch" bad_bot
SetEnvIfNoCase User-Agent "^Openfind data gathere" bad_bot
SetEnvIfNoCase User-Agent "^Openfind" bad_bot
SetEnvIfNoCase User-Agent "^Xenu's Link Sleuth 1.1c" bad_bot
SetEnvIfNoCase User-Agent "^Xenu's" bad_bot
SetEnvIfNoCase User-Agent "^Zeus" bad_bot
SetEnvIfNoCase User-Agent "^RepoMonkey Bait & Tackle/v1.01" bad_bot
SetEnvIfNoCase User-Agent "^RepoMonkey" bad_bot
SetEnvIfNoCase User-Agent "^Zeus 32297 Webster Pro V2.9 Win32" bad_bot
SetEnvIfNoCase User-Agent "^Webster Pro" bad_bot
SetEnvIfNoCase User-Agent "^EroCrawler" bad_bot
SetEnvIfNoCase User-Agent "^LinkScan/8.1a Unix" bad_bot
SetEnvIfNoCase User-Agent "^Keyword Density/0.9" bad_bot
SetEnvIfNoCase User-Agent "^Kenjin Spider" bad_bot
SetEnvIfNoCase User-Agent "^Cegbfeieh" bad_bot

<Limit GET POST>
order allow,deny
allow from all
Deny from env=bad_bot
</Limit>
__________________
:: Rivmedia - vBulletin and Forum services
:: Mammoth Host - UK Small Business Hosting
:: Webmaster Forums - Join our Webmaster community Today
The opinions expressed in forum posts are my own personal opinions and do not represent any companies that i am associated with.
#14 · 12-16-2010, 10:29 AM · mongo50
OK, so what I ended up doing was spending a lot of quality time with my error logs. It turned out the bots in question weren't giving a referer or agent string. Two were overseas and one was amazonaws.com, which was sucking an immense amount of bandwidth for no apparent benefit to me. I think I've stopped them for the moment...

#15 · 12-16-2010, 12:09 PM · BirdOPrey5
I haven't implemented this myself yet, but this is like the ultimate security for stopping malicious bots and hack attempts:

http://www.spambotsecurity.com/zbblock.php
#16 · 12-17-2010, 04:59 AM · mongo50
Very interesting, BirdOPrey5. I just did a quick cruise through http://www.spambotsecurity.com/zbblock.php and the only drawback I see is that it doesn't do anything to protect non-PHP files. Is there anything else like this out there? Thanks again...
#17 · 12-17-2010, 02:24 PM · BirdOPrey5
I ended up experimenting with ZBBlock today... Its default setup is very strict IMO, and I found it will block the occasional legitimate user, so I personally ended up disabling it again for now while I try to edit the settings more to my liking. For example, on the ZBBlock forums there is a thread on how to allow AOL proxies, which are blocked by default. It also blocked people following links to my site because the referring page had what ZBBlock considered "spam" words in the URL...

It is a great system, but the out-of-the-box settings are too strict IMO... It also kills Tapatalk, so I had to remove it for now.

As for not protecting non-PHP pages: there isn't much that can be done to exploit non-PHP pages, so I don't really think it's that big of a concern, assuming you're not running ASP or some other language.
#18 · 12-21-2010, 10:11 AM · Raymond
Just a heads up: some bots change their IP on every visit, which can't be blocked unless you block the actual site sending them. I never recommend IP-banning people anyway, as the offender can simply change their IP and someone else may later be assigned it.
__________________
Part of Management of Pro Call Of Duty Team Obey
Official Website | Twitter | FaceBook Fan Page
#19 · 01-29-2011, 03:32 PM · Anonymous
I've had a tremendous amount of success using ZBBlock on my IPB 2.3 forum and PHP-based website. In addition to stopping spam bot registrations, it's protecting my site from other malicious bots, like hackbots poking around for exploits and site content scrapers. My first 24 hours using it saw over 1000 malicious site requests blocked, and they've been declining ever since.

Like BirdOPrey5 said, the default blocks may be a bit too harsh if you have international users. I created custom signature bypasses for some lousy ISPs in Italy, Poland, Brazil, Thailand, and Colombia that foster spammers, because I noticed an occasional legitimate user was blocked.

It's very easy to install and is compatible with virtually everything. It works 'out of the box', but it's not exactly intended to be left that way. You will want to monitor your block log files to make sure legitimate traffic is still getting through. After a few days of monitoring my logs and making a few signature customizations, I'm confident that only malicious traffic is being blocked.

It checks that URL queries against your website are not malicious or exploitative, looking for things like SQL injection attempts, keyword spamming, and directory traversal, and shuts them down before your own PHP scripts have even been initialized. It also blocks known bad ISPs, IP ranges, hostnames, spiders, and user agents. It additionally checks visitor IP addresses against the stopforumspam.com database on critical pages like registration and login. I went from 50+ bogus users a day to a manageable 3 or 4 (I now manually approve new accounts, since spambots started hammering forums a few weeks ago).

It saves on bandwidth and resources because it runs before any of your site's content is loaded. I'm currently using it on every page, serving over 100,000 page views a day, with no slowdown in page load times. From one forum owner to another, I highly recommend giving it a try on your website. It takes a little investment of time to get it set up just right, but it's worth it. The developer is also very active on the support forums, and he's been able to answer every question I've had about creating custom bypasses when needed.
#20 · 12-21-2012, 10:46 PM · zylstra
Haha! I just tried going to the ZBBlock link but got an error from ZBBlock blocking me.