Restrict spiders to low-demand times?

Ye' old general discussion board. Basically, for everything that isn't covered elsewhere. Come here to shoot the breeze, shoot your mouth off, or whatever suits your fancy.
This forum is not for asking programming related questions.

Moderator: General Moderators

intellivision
Forum Commoner
Posts: 83
Joined: Mon Aug 22, 2005 1:25 am
Location: Orbit

Restrict spiders to low-demand times?

Post by intellivision »

Is there any syntax I can place in robots.txt to restrict (well, ask, prod) spiders to visiting only at certain hours?

I know there's a page-request-frequency syntax, but limiting crawls to certain hours would work better for my server. It's getting kinda taxed.
Peter Anselmo
Forum Commoner
Posts: 58
Joined: Wed Feb 27, 2008 7:22 pm

Re: Restrict spiders to low-demand times?

Post by Peter Anselmo »

Yes, there is. Here's an example robots.txt file:

Code:

User-agent: *
Disallow: 
Request-rate: 1/5         # maximum rate is one page every 5 seconds
Visit-time: 0600-0845     # only visit between 06:00 and 08:45 UTC (GMT)
 
Wikipedia's robots.txt article also has lots of detailed info.

Keep in mind that robots.txt is just a suggestion to compliant crawlers; spam bots won't check or care. Also, Request-rate and Visit-time are nonstandard extensions to the original robots exclusion standard, so even some well-behaved major crawlers may ignore them.
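If you're curious how a crawler that does honor Visit-time might implement it, here's a minimal sketch in Python. The function names are made up for illustration; it just parses the HHMM-HHMM window from the example above and checks it against the current UTC time:

```python
from datetime import datetime, timezone

def parse_visit_time(value):
    """Parse a 'Visit-time' value like '0600-0845' into (start, end)
    offsets in minutes from midnight UTC."""
    start, end = value.split("-")

    def to_minutes(hhmm):
        return int(hhmm[:2]) * 60 + int(hhmm[2:])

    return to_minutes(start), to_minutes(end)

def within_visit_time(value, now=None):
    """Return True if `now` (UTC, defaults to the current time) falls
    inside the allowed Visit-time window."""
    now = now or datetime.now(timezone.utc)
    minutes = now.hour * 60 + now.minute
    start, end = parse_visit_time(value)
    return start <= minutes <= end

# A compliant crawler would call within_visit_time("0600-0845") before
# each fetch and sleep (or requeue the URL) when it returns False.
```

So from the crawler's side it's just a clock check before each request, which is exactly why it's only ever a polite request from the server's side.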