Is there a syntax to place in robots.txt to ask (well, prod) spiders to visit only at certain hours?
I know there's a request-rate syntax, but restricting crawls to certain hours would work better for my server. It's getting kinda taxed.
Restrict spiders to low-demand times?
intellivision (Forum Commoner, Posts: 83, Joined: Mon Aug 22, 2005 1:25 am, Location: Orbit)
Peter Anselmo (Forum Commoner, Posts: 58, Joined: Wed Feb 27, 2008 7:22 pm)
Re: Restrict spiders to low-demand times?
Yes, there is. Keep in mind that robots.txt is just a suggestion to compliant crawlers; spam bots won't check it or care, and the Request-rate and Visit-time directives below are nonstandard extensions that only some crawlers honor. Wikipedia also has lots of detailed info on the topic. Here's an example robots.txt file:
Code:
User-agent: *
Disallow:
Request-rate: 1/5 # maximum rate is one page every 5 seconds
Visit-time: 0600-0845 # only visit between 06:00 and 08:45 UTC (GMT)
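If a crawler doesn't honor the visit-hours window, a more widely recognized (though still nonstandard) alternative is Crawl-delay, which asks for a minimum pause between requests. Some major crawlers (Bing, Yandex) have honored it, while Google ignores it. A sketch:

Code:
User-agent: *
Crawl-delay: 10 # ask for at most one request every 10 seconds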
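Since robots.txt is only advisory, the reliable way to shed crawler load outside your preferred hours is to enforce it server-side. Here's a minimal sketch in Python (the function and constants are hypothetical; the 06:00-08:45 UTC window mirrors the Visit-time example above). It returns True when a request should be served and False when you'd answer with a 503 plus a Retry-After header so well-behaved bots come back later:

```python
from datetime import datetime, time, timezone

# Allowed crawl window, mirroring the Visit-time example (06:00-08:45 UTC).
ALLOWED_START = time(6, 0)
ALLOWED_END = time(8, 45)

# Sample User-Agent substrings for well-known crawlers; extend as needed.
CRAWLER_TOKENS = ("googlebot", "bingbot", "yandex")

def allow_crawler(user_agent, now=None):
    """Return True to serve the request, False to send 503 + Retry-After."""
    ua = (user_agent or "").lower()
    if not any(token in ua for token in CRAWLER_TOKENS):
        return True  # not a known crawler: always serve
    now = now or datetime.now(timezone.utc)
    return ALLOWED_START <= now.time() <= ALLOWED_END
```

Ordinary visitors are never blocked; only requests whose User-Agent matches a known crawler are deferred outside the window.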