Hi All,
I have a spec for a website that will handle the upload and retrieval of PDF files, available only to members (or customers) who have paid for the right to view them.
However, what is to stop:
1. The documents being indexed by a search engine? (Is there a way to do this dynamically in the robots.txt file?)
2. People accessing them directly via the URL?
It is very important that I can provide an answer.
I hope you can help.
Many thanks
Securing Documents
Don't put them in the webroot.
Let's say you have a hosting account at /whatever, and your website is at /whatever/public_html. Simply make a subdirectory of /whatever that is *not* a subdirectory of public_html, for example /whatever/secure.
Then secure that directory: give only Apache access to it, and add an .htaccess rule that denies all HTTP access (because your PHP script will read the files via the filesystem, not via HTTP calls).
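As a belt-and-braces measure, a minimal deny-all .htaccess for that directory might look like this (the /whatever/secure path is just the example from above; the first directive is Apache 2.4 syntax, the commented alternative is for Apache 2.2):

```apache
# /whatever/secure/.htaccess
# Block all HTTP access to this directory; PHP reads the files
# through the filesystem, so no web client ever needs to reach it.
Require all denied

# On Apache 2.2, use instead:
# Order deny,allow
# Deny from all
```

Strictly speaking this rule only matters if the directory could ever be reached over HTTP; since /whatever/secure sits outside public_html, it is unreachable anyway, and the rule is just insurance against a misconfiguration.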
Then use PHP to save documents to that area, and to load and send them (use header(), output buffering, and a little file magic).
That way, people can't access them directly (there is no weblink to them), and search engines can't either.
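A minimal sketch of the serving script described above might look like this. The path, the `is_paid_member` session flag, and the `file` query parameter are all assumptions for illustration, not part of the original spec; wire in your own membership check:

```php
<?php
// serve_pdf.php - sketch only; adapt the path and auth check to your setup.
session_start();

// Only members who have paid may download (hypothetical session flag).
if (empty($_SESSION['is_paid_member'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}

// basename() strips any directory components, preventing path
// traversal tricks like ?file=../../etc/passwd
$file = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = '/whatever/secure/' . $file;

if ($file === '' || !is_file($path)
    || strtolower(pathinfo($path, PATHINFO_EXTENSION)) !== 'pdf') {
    header('HTTP/1.1 404 Not Found');
    exit('File not found.');
}

// Send the PDF from outside the webroot.
header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: inline; filename="' . $file . '"');
readfile($path);
```

Member pages then link to something like serve_pdf.php?file=report.pdf rather than to the file itself, so there is no direct URL to the PDF for visitors or search engines to hit, and every download passes through your access check.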