A Script for Seeing Crawlers

Colorado Newbie

A couple of days ago I found an article about crawlers that mentioned a script called CrawlTrack. Last night I decided to install it because I'd really like to know more about crawlers. If you have access to AWStats you'll see all sorts of hits from crawlers, but no name for them.

CrawlTrack tells you who they are, what pages they've crawled, when they last came through, and how often they visit. It also alerts you to any potential hacking attempts.

Oddly enough, the Chinese seem to have a great interest in my website. In the last 15 hours alone, Baidu Spider has been through 32 times.
 
Yahoo Slurp bots are real bandwidth hogs.
Keeping that in check is an important part of maintaining a website properly, because most of the time the problem goes unnoticed by web developers and webmasters.

You can filter these crawlers or set a crawl delay for them in your robots.txt file, and you can restrict access outright with .htaccess rules.
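For example, here is a minimal sketch of both approaches. The user-agent strings, the /private/ path, and the choice to single out Baiduspider are just illustrations; Crawl-delay is a non-standard directive that Slurp respects but some other crawlers ignore.

    # robots.txt -- ask Slurp to wait 10 seconds between requests,
    # and keep Baiduspider out of one section of the site
    User-agent: Slurp
    Crawl-delay: 10

    User-agent: Baiduspider
    Disallow: /private/

robots.txt only works for crawlers that choose to obey it. To refuse a crawler outright, something like this in .htaccess (Apache 2.2 syntax; Apache 2.4 uses Require directives or the mod_access_compat module) returns a 403 based on the User-Agent header:

    # .htaccess -- block requests whose User-Agent matches "Baiduspider"
    SetEnvIfNoCase User-Agent "Baiduspider" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot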

Val.
 