Page 2 of 2

Posted: Tue Sep 27, 2005 1:48 pm
by Roja
petterg wrote: Well, I kind of am the provider (as I'm the root user on the system). I need some help to prevent it!
Open_basedir and Safe mode are the two best bets, short of restructuring to use virtual hosting.

I've used both in various configurations, so if you'd like some advice, feel free to start a topic on setting either up.
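For reference, both can be set per directory or per vhost in httpd.conf; a minimal sketch with hypothetical paths:

```apache
# Confine a site's PHP to its own tree (plus /tmp for sessions/uploads).
php_admin_value open_basedir "/home/site1/public_html:/tmp"
# Enable safe_mode's UID check on include/fopen.
php_admin_flag safe_mode on
```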

Posted: Wed Sep 28, 2005 8:50 am
by petterg
feyd wrote:Give them a blackbox class that does a socket connection to a second instance of the web server, that instance would then access to the actual data files (run under a different user). The live site can use the normal version of the blackbox script.
Running two instances of the web server under different users might be a very good idea. Do you know of any good howtos for this?
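I haven't seen a single howto for it, but a minimal sketch of a second instance's config (hypothetical paths, users, and port; the blackbox class would connect to 127.0.0.1:8080) would be:

```apache
# Started separately, e.g.:  httpd -f /etc/apache2/httpd-backend.conf
Listen 127.0.0.1:8080          # only reachable from the local machine
User datauser                  # different user than the public instance
Group datagroup
PidFile /var/run/apache-backend.pid
ErrorLog /var/log/apache-backend-error.log
DocumentRoot /srv/backend      # holds the real data files, owned by datauser
```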

Posted: Wed Sep 28, 2005 9:06 am
by petterg
Roja wrote:
petterg wrote: Well, I kind of am the provider (as I'm the root user on the system). I need some help to prevent it!
Open_basedir and Safe mode are the two best bets, short of restructuring to use virtual hosting.
Open_basedir might also be useful. I wasn't aware of it until now.
php_doc wrote:In httpd.conf, open_basedir can be turned off (e.g. for some virtual hosts) the same way as any other configuration directive with "php_admin_value open_basedir none".
Is there any way to invert the function of open_basedir, in the sense of allowing access to all files EXCEPT those in the listed directories? That way I could set up a virtual host for my files and block the other domains from reaching them.
Maybe I could combine open_basedir with two instances of Apache just to be sure....
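As far as I can tell open_basedir is allow-list only, so it can't be inverted directly, but per-vhost allow lists would give the same effect: every other vhost gets a base dir that excludes my tree. A sketch with hypothetical paths:

```apache
<VirtualHost *:80>
    ServerName other-domain.example
    DocumentRoot /home/other/public_html
    # This vhost's PHP can only open files under its own tree,
    # so /home/petterg stays unreachable even if it is world-readable.
    php_admin_value open_basedir "/home/other/public_html:/tmp"
</VirtualHost>
```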

Posted: Mon Oct 03, 2005 9:39 pm
by shiflett
petterg wrote: I have the feeling there is a need to modify the PHP parser to make this work: modify it so that files used by include and fopen must be readable by the owner of the running script, or the include/fopen functions should fail. As it is today, they only need to be readable by the Apache user.
This is what safe_mode does, but be careful. It is architecturally incorrect to try to solve this with PHP, and that's exactly what the manual states. I could write a Perl script that reads your PHP files, and my Perl script doesn't care what your php.ini says. :-)

If PHP is the only thing supported, then safe_mode offers more protection, but I can still simply create a file with PHP, and that file will be owned by Apache (therefore, that file can contain PHP code, and that code avoids the safe_mode restrictions for any file owned by Apache). Given your recent line of questioning, I don't understand why my original suggestion isn't sufficient. It's a much stronger technique.
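The file-creation bypass can be sketched like this (hypothetical paths):

```php
<?php
// Running as the Apache user, this script writes a new file.
// The new file is therefore owned by the Apache user.
file_put_contents('/var/www/site/dropped.php',
    '<?php readfile("/var/www/uploads/some-apache-owned-file"); ?>');

// When dropped.php is later requested, safe_mode's UID comparison passes
// for any target file also owned by Apache (sessions, uploads, anything
// PHP itself created), so those files are readable despite safe_mode=on.
```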

Posted: Tue Oct 04, 2005 2:25 am
by timvw
Currently we have the following setup: Linux with grsec patches

php with safe_mode=on
safe_mode_include_dir = /usr/share/pear:/usr/share/php

User directories are created with ownership user:services;
users create files as user:users.

If you chmod the user directory to 700, only the user can get in.
Next you need to allow services (Apache runs as www-data, which is in group services) into the home directory:
chmod g+x ~
Then give access to the directory with the website files:
chmod o+x ~/website
And give read access to the credentials file:
chmod o+r ~/secrets.php

Now other users can't get into your home directory, because only the owner and members of the services group are allowed in. So even though secrets.php is readable by "others", nobody in the users group can actually reach it.

Thus, if you only allow Apache to execute PHP scripts (with safe_mode), you can effectively prevent others from reading your secrets files...
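The scheme can be sketched with a scratch directory standing in for the home directory (the real setup uses actual user/group accounts, which a sketch can't create):

```shell
home=$(mktemp -d)              # stands in for the user's home directory
mkdir "$home/website"
touch "$home/secrets.php"

chmod 700 "$home"              # only the owner may enter the home directory
chmod g+x "$home"              # let the services group (Apache) traverse it
chmod 700 "$home/website"
chmod o+x "$home/website"      # let Apache descend into the website directory
chmod 600 "$home/secrets.php"
chmod o+r "$home/secrets.php"  # Apache reads credentials via the "other" bit

stat -c '%a %n' "$home" "$home/website" "$home/secrets.php"
```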


We need web servers that can run as the owner of the files :) Or run as a different owner per virtual host.. I think this was announced as a feature for Apache 2 a couple of years ago, but it has gone very quiet on that front...

Posted: Tue Oct 04, 2005 6:40 pm
by petterg
timvw
Either I'm misunderstanding what you're writing, or you're not considering the kind of scripting (PHP and/or Perl) that shiflett describes in the post above yours.
We need webservers that can run as the owner of the files. Or run as a different owner per virtual host..
Is that possible? If so, it's quite a useful feature.

Posted: Tue Oct 04, 2005 6:46 pm
by petterg
shiflett wrote:I don't understand why my original suggestion isn't sufficient. It's a much stronger technique.
Isn't your original suggestion just as vulnerable as this?

Posted: Tue Oct 04, 2005 8:18 pm
by josh
Why couldn't you just have all the employees do their reading and writing of files through a web panel? Your PHP script could then block access to any of the blacklisted files.

They wouldn't have FTP, but you could let them edit files through this 'web admin panel'.
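A minimal sketch of such a check, with hypothetical paths and a hand-rolled blacklist:

```php
<?php
// Hypothetical blacklist for a web-based file editor.
$blacklist = array('/home/petterg/secrets.php', '/home/petterg/private');

function is_allowed($path, $blacklist) {
    $real = realpath($path);           // resolves symlinks and ../ tricks
    if ($real === false) return false; // refuse paths that don't resolve
    foreach ($blacklist as $blocked) {
        if (strncmp($real, $blocked, strlen($blocked)) === 0) return false;
    }
    return true;
}

$file = isset($_GET['file']) ? $_GET['file'] : '';
if (is_allowed($file, $blacklist)) {
    echo htmlspecialchars(file_get_contents($file));
} else {
    header('HTTP/1.0 403 Forbidden');
    echo 'Access denied.';
}
```

Note the realpath() call: without it, a request for ../../secrets.php would slip past a plain string comparison.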

Posted: Wed Oct 05, 2005 12:38 pm
by timvw
petterg wrote:timvw
Either I'm misunderstanding what you're writing, or you're not considering the kind of scripting (PHP and/or Perl) that shiflett describes in the post above yours.
I have an advantage, because we only have mod_php (no cgi/mod_cgi/mod_perl etc)..
This allows us to keep users in their own directories...

I do like the solution offered by shiflett, but it requires you not to output environment variables (DB_USER etc.)..


petterg wrote:
We need web servers that can run as the owner of the files. Or run as a different owner per virtual host..
Is that possible? If so, it's quite a useful feature.

I'm afraid it isn't really possible right now. For sites that really require it but don't want to pay for a dedicated server, I would suggest UML (User-mode Linux).