I'm trying to find a way to limit the number of pages a user can access. I'm on a shared server, so I only have access to .htaccess; I also have PHP parse all my files.
My site has about 300 pages, but my logs sometimes show "users" who view over 1,000 pages; I guess these are robots. How can I limit this to, say, 100?
Does MaxKeepAliveRequests have anything to do with this? Does that directive have any effect when all my files are parsed by PHP? I also don't want Google to be negatively affected, so can this be done while excluding some user-agents?
Limit max amount of pages a user can access
- Ambush Commander
- DevNet Master
- Posts: 3698
- Joined: Mon Oct 25, 2004 9:29 pm
- Location: New Jersey, US
Re: Limit max amount of pages a user can access
MaxKeepAliveRequests doesn't have anything to do with this. Since your website is open to the general public, there's no foolproof way to limit the number of pages a user can access. If you're not running into CPU or bandwidth problems, I'd recommend you just not worry about it... the bots can't hurt.
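One thing worth mentioning for well-behaved bots: you can ask them to slow down or stay out of parts of your site with a robots.txt file in your document root. This is only a request, not an enforcement mechanism, and the paths below are purely illustrative:

Code: Select all
User-agent: *
# Ask crawlers to wait between requests (honored by some bots, ignored by others)
Crawl-delay: 10
# Keep crawlers out of a hypothetical section you don't want indexed
Disallow: /private/

Googlebot respects Disallow rules but ignores Crawl-delay, so this won't cap Google's page count; it mainly helps against polite third-party crawlers.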
- Christopher
- Site Administrator
- Posts: 13596
- Joined: Wed Aug 25, 2004 7:54 pm
- Location: New York, NY, US
Re: Limit max amount of pages a user can access
You could implement something using a session variable that counts page views and stops serving pages after a certain count. Include something like this in every page (or in a front controller).
Code: Select all
<?php
// Count this visitor's page views in the session; stop serving after 100.
session_start();
if (!isset($_SESSION['pagecount'])) {
    $_SESSION['pagecount'] = 0;
}
if ($_SESSION['pagecount'] > 100) {
    exit('Page limit reached.');
}
++$_SESSION['pagecount'];
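To address the original question about not hurting Google, one rough sketch is to skip the counter when the User-Agent matches a known crawler. Note that User-Agent strings are trivially spoofed, so this is a courtesy filter, not security. The helper name is_exempt_crawler and the bot list are my own illustrations, not anything built into PHP:

Code: Select all
<?php
// Hypothetical helper: returns true when the User-Agent string looks like
// a known search-engine crawler. The list is illustrative, not exhaustive.
function is_exempt_crawler($userAgent)
{
    $bots = array('Googlebot', 'bingbot', 'Slurp'); // "Slurp" is Yahoo's crawler
    foreach ($bots as $bot) {
        if (stripos($userAgent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

// Only count the page view for non-crawler traffic.
session_start();
if (!isset($_SESSION['pagecount'])) {
    $_SESSION['pagecount'] = 0;
}
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (!is_exempt_crawler($ua)) {
    if ($_SESSION['pagecount'] > 100) {
        exit('Page limit reached.');
    }
    ++$_SESSION['pagecount'];
}

Since the check is case-insensitive substring matching, it also catches full tokens like "Googlebot/2.1"; a stricter setup could verify crawler IPs via reverse DNS, but that's beyond a simple page counter.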