Multiple Users Writing To The Same Log File?

This forum is not for 'how-to' coding questions but for PHP theory; it is here for those of us who wish to learn about the design aspects of programming with PHP.

Moderator: General Moderators

Post Reply
Vik
Forum Newbie
Posts: 13
Joined: Sun May 23, 2004 1:12 pm
Location: Los Angeles, CA

Multiple Users Writing To The Same Log File?

Post by Vik »

I'm building a PHP app, and I thought it would be good to keep a log of how many times users complete a specific operation on my site. My first thought was, just have a log file, in tab text format, and write a line to it whenever a user completes that action.

But then it occurred to me - what if multiple users are on the system at once, so that the log file is open by one user's session, when another session tries to write to it?

How does PHP handle that? Is there a potential for a session to lock up until all the previous sessions accessing the log file have completed the operation? How do you avoid the appearance to the user that the system has become unresponsive?

Thanks very much in advance to all for any info.
User avatar
Christopher
Site Administrator
Posts: 13596
Joined: Wed Aug 25, 2004 7:54 pm
Location: New York, NY, US

Post by Christopher »

Take a look at the file locking functions in the manual. There are a number of examples around of loggers and caches that use file locking.
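A minimal sketch of the flock() approach (the file path and message format here are placeholders, not anything from the manual examples):

```php
<?php
// Append one line under an exclusive lock. Other processes calling
// flock() on the same file block until the lock is released.
function log_line(string $file, string $message): void
{
    $handle = fopen($file, 'ab'); // 'a' = append, create if missing
    if ($handle === false) {
        return; // don't break the request just because logging failed
    }
    if (flock($handle, LOCK_EX)) { // blocks until the lock is free
        fwrite($handle, $message . "\n");
        fflush($handle);           // flush before releasing the lock
        flock($handle, LOCK_UN);
    }
    fclose($handle);
}

log_line('/tmp/flock_demo.log', date('c') . "\toperation completed");
```

Because each writer holds the lock only for the duration of one small write, waits stay short in practice.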
Vik
Forum Newbie
Posts: 13
Joined: Sun May 23, 2004 1:12 pm
Location: Los Angeles, CA

Post by Vik »

Thanks for the info. So, it looks very straightforward - just lock the file, write to it, and unlock it. Does this mean that in practice users don't see delays from a lot of sessions trying to write to the same file at once? To take an extreme example, if, say, 100 simultaneous sessions tried to write to the same log file, would it look to session #100 as though the system had locked up? (I only have to write <256 bytes to the log file for each logging action.)

In my case, it would not cause data loss to postpone the logging action to run at some later time, asynchronously with the user's system access. Does anyone go to this extreme?
User avatar
Christopher
Site Administrator
Posts: 13596
Joined: Wed Aug 25, 2004 7:54 pm
Location: New York, NY, US

Post by Christopher »

There are certainly well-designed logging systems that can log under heavy load. Or you might want to think about logging to a database, because databases have locking built in.
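A minimal sketch of the database route, using SQLite via PDO as a stand-in (the file path and table layout are assumptions, not anything from this thread):

```php
<?php
// The database serializes concurrent INSERTs itself, so no
// application-level file locking is needed.
$pdo = new PDO('sqlite:/tmp/log_demo.db');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS action_log (
    logged_at TEXT,
    action    TEXT
)');

// Prepared statement: one row per completed operation.
$stmt = $pdo->prepare('INSERT INTO action_log (logged_at, action) VALUES (?, ?)');
$stmt->execute([date('c'), 'operation_completed']);
```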
User avatar
dbevfat
Forum Contributor
Posts: 126
Joined: Tue Jun 28, 2005 2:47 pm
Location: Ljubljana, Slovenia

Post by dbevfat »

About the locks: yes, locking a file means the file isn't accessible for writing by any other process. Other processes have to wait until the lock is released, and even though a single write is fast, this can actually become a bottleneck.

As arborint said, logging to the database is an option, although it introduces some other issues, mainly in the performance area. For example, you probably shouldn't use indices, because each insertion writes to the index, and that operation is much slower than a single write to a file. You also have to purge older records if you don't want the database to grow too big. And if you want to select older records, you need at least one index on the timestamp to avoid a full table scan. With files, this is easily solved with file rotation: no file gets too big, and you can access older records without interfering with the current log file. A database is definitely an option and may suit you fine, but you have to consider the drawbacks.
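Rotation can be as simple as putting the date in the file name, so a new file starts at midnight and older files can be archived or purged without touching the live log (a sketch; the path and line format are placeholders):

```php
<?php
// Date-stamped log file: "rotation" happens automatically when the
// date changes, and yesterday's file is free to compress or delete.
$file = '/tmp/actions-' . date('Y-m-d') . '.log';
$handle = fopen($file, 'ab');
if ($handle !== false && flock($handle, LOCK_EX)) {
    fwrite($handle, date('c') . "\toperation_completed\n");
    flock($handle, LOCK_UN);
    fclose($handle);
}
```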

On a website I develop, we're using syslog (http://www.php.net/syslog), and it works great (or at least well enough not to have caused any problems so far). The number of requests is approximately 2M/day, and the number of logged entries is essentially the same. I'm not particularly familiar with how it works, but I believe there is a daemon process running in the background and PHP posts the messages to it. So the concurrent-write problem is basically removed; there is only one writer (syslogd).
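For reference, the syslog calls look like this (the 'myapp' tag and the message are placeholders):

```php
<?php
// openlog() sets the tag and options; syslog() hands the message to
// the local syslog daemon, which is then the only process writing
// the log file.
openlog('myapp', LOG_PID, LOG_USER);
$ok = syslog(LOG_INFO, 'user completed operation');
closelog();
```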

You can also look at log4php (http://incubator.apache.org/log4php/index.html), which is a port of log4j, a well-known, stable logging solution for Java.

best regards
User avatar
nathanr
Forum Contributor
Posts: 200
Joined: Wed Jun 07, 2006 5:46 pm

Post by nathanr »

maybe worth looking into stream_set_write_buffer - either set it to 0 or to the exact length of the data to be written?
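A sketch of that idea, assuming an append-mode handle (the file name is a placeholder). stream_set_write_buffer() returns 0 on success; with a buffer size of 0, each fwrite() goes straight through to the OS rather than sitting in a PHP-level buffer:

```php
<?php
// Disable PHP's stream buffer so the locked write isn't left
// sitting in userland when the lock is released.
$handle = fopen('/tmp/buffered_demo.log', 'ab');
stream_set_write_buffer($handle, 0); // 0 = unbuffered, write through
if (flock($handle, LOCK_EX)) {
    fwrite($handle, "one complete line\n");
    flock($handle, LOCK_UN);
}
fclose($handle);
```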

or spawn a single process that logs to the file and has its own buffer, then simply send the desired log entries to that ever-running process?

database..
User avatar
s.dot
Tranquility In Moderation
Posts: 5001
Joined: Sun Feb 06, 2005 7:18 pm
Location: Indiana

Post by s.dot »

I've used file writing on a fairly high-trafficked web site before, using file locking, and it worked well, with never any noticeable delays, at least for me.
You'd probably have to have 50,000+ people online to get 100 people trying to write to the file in the same fraction of a second :P.

For me it was as simple as

Code: Select all

$handle = fopen('/file.txt', 'a'); // append mode; fopen() needs a mode argument
flock($handle, LOCK_EX);
fwrite($handle, $data);
flock($handle, LOCK_UN);
fclose($handle);
Vik
Forum Newbie
Posts: 13
Joined: Sun May 23, 2004 1:12 pm
Location: Los Angeles, CA

Post by Vik »

Thanks very much to all for this great info!
Post Reply