Securing Documents

Posted: Fri Jul 22, 2005 4:48 pm
by TheOracle
Hi All,

I have a spec for a website that will be handling the upload and retrieval of PDF files, available only to members (or customers) who have paid for the right to view them.

However, what is to stop:

1. The documents being indexed by a search engine? (Is there a way to prevent this dynamically in the robots.txt file?)
2. People accessing them directly via the URL?

It is very important that I can provide an answer to this.

I hope you can help.

Many thanks

Posted: Fri Jul 22, 2005 6:27 pm
by Roja
Don't put them in the webroot.

Let's say you have a hosting account at /whatever, and your website is at /whatever/public_html. Simply make a subdirectory of /whatever that is *not* a subdirectory of public_html, for example /whatever/secure.

Then secure that directory: give only Apache access to it, and add an .htaccess rule that denies all HTTP requests (because your PHP script will be accessing it via the filesystem, NOT via HTTP calls).
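As a belt-and-braces measure, the .htaccess could be as simple as this (Apache 2.x syntax; a sketch, since the exact rule isn't shown in the thread):

```apacheconf
# /whatever/secure/.htaccess
# Deny every HTTP request to this directory.
# PHP still reads the files directly from the filesystem.
Order deny,allow
Deny from all
```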

Then use PHP to save documents to that area, and to load and send them (use header(), output buffering, and a little file magic).

That way, people can't access them directly (there is no weblink to them), and search engines can't either.
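A minimal sketch of the "load and send" step, assuming the /whatever/secure directory from above; the function names here are my own, and the membership check (is_member() or whatever you use) is up to your app:

```php
<?php
// Resolve a client-requested filename to a safe path inside the
// secure directory, or return false if the file doesn't exist.
function resolve_secure_pdf($requested, $secureDir = '/whatever/secure')
{
    // basename() strips any directory components, so "../" tricks
    // in the request can't escape the secure directory.
    $path = $secureDir . '/' . basename($requested);
    return is_file($path) ? $path : false;
}

// Stream the PDF to the browser with the right headers.
function send_pdf($path)
{
    header('Content-Type: application/pdf');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: inline; filename="' . basename($path) . '"');
    readfile($path);
}

// Typical usage, after your own membership check:
// if (is_member() && ($path = resolve_secure_pdf($_GET['file']))) {
//     send_pdf($path);
// } else {
//     header('HTTP/1.0 403 Forbidden');
// }
```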

Posted: Sat Jul 23, 2005 2:42 am
by TheOracle
Thanks Roja,

Would the same apply for IIS as for Apache?

I create a directory one level above the webroot and give only the web user access to it. Then, using PHP, I can call move_uploaded_file() to move the file into that directory, and retrieve it from the root directory using:

../secure/filename.pdf
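Roughly, the upload side I have in mind (the form field name 'pdf' is just an example, and the /secure directory name follows the earlier post):

```php
<?php
// Build the destination path one level above the webroot, stripping any
// directory components from the client-supplied filename.
function secure_path($docroot, $uploadedName)
{
    return dirname($docroot) . '/secure/' . basename($uploadedName);
}

// Usage in the upload handler:
// if (isset($_FILES['pdf']) && $_FILES['pdf']['error'] === UPLOAD_ERR_OK) {
//     $dest = secure_path($_SERVER['DOCUMENT_ROOT'], $_FILES['pdf']['name']);
//     move_uploaded_file($_FILES['pdf']['tmp_name'], $dest);
// }
```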

Posted: Sat Jul 23, 2005 2:19 pm
by Roja
TheOracle wrote: Would the same apply for IIS as for Apache?
Everything except for the .htaccess file. Thankfully, that's just a backup security measure, so yes - same deal.