concurrent access to log file
Posted: Wed Apr 14, 2004 5:33 am
by stefanx
Hi,
I want to use some logging to record e.g. user errors and other problems occurring during my PHP script execution. There are numerous code examples already available. But I wonder: if I have some level of user traffic and a few events happen at the same time, what happens to my log file / events? Will I lose some events? Will I get errors? Do I have to handle this case myself, e.g. by using some kind of queue?
Thanks!
-Stefan

Posted: Wed Apr 14, 2004 8:13 am
by liljester
in a computer "a few things" cannot happen at the same time =) you should be fine if you stick everything in a log file
Posted: Wed Apr 14, 2004 9:09 am
by stefanx
liljester,
of course, but wouldn't that require the log file writing to be 'atomic'? I mean, one instance would open the log file and start to write. What about another instance? They will continue to work as soon as they get CPU time. Or are the write accesses somehow buffered by PHP?
Br
-Stefan
Posted: Wed Apr 14, 2004 11:10 am
by TheBentinel.com
In my experience, when you get a few concurrent users pounding on a logfile, you start getting messages mixed together. You are likely to lose/garble messages.
If it's important that you not lose messages in this way, write each message to a unique file, then merge those files at the end of the day.
Or better still, keep your log in the database. Then each message has its own record.
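A minimal sketch of the unique-file-per-message idea, assuming a hypothetical log directory (`/tmp/applog`) and a hypothetical helper name (`log_message`), neither of which comes from the thread:

```php
<?php
// Sketch of the "one file per message" approach described above:
// each message gets its own uniquely named file, so two requests
// never write to the same file and nothing can get interleaved.
function log_message(string $msg): void
{
    $dir = '/tmp/applog';   // assumed log directory
    if (!is_dir($dir)) {
        mkdir($dir, 0777, true);
    }
    // uniqid() seeded with the process id makes collisions between
    // concurrent requests very unlikely
    $file = $dir . '/' . uniqid(getmypid() . '_', true) . '.log';
    file_put_contents($file, date('c') . ' ' . $msg . "\n");
}

log_message('user error: invalid form input');
?>
```

A nightly cron job could then concatenate the files, sort the lines by timestamp, and delete the fragments.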
Posted: Thu Apr 15, 2004 5:16 pm
by Weirdan
you can use the flock() function to ensure that only one instance of your script is writing to the log at any given time. the syslog facility might be of help too
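A sketch of that suggestion: take an exclusive lock with flock() before appending, so concurrent writers queue up instead of interleaving. The file path, message format, and function name are assumptions for illustration:

```php
<?php
// Serialize writers with an exclusive flock() lock: a second request
// calling this function blocks at flock() until the first releases.
function log_message(string $msg, string $path = '/tmp/app.log'): void
{
    $fp = fopen($path, 'a');      // 'a' opens for append
    if ($fp === false) {
        return;                   // could not open the log; give up quietly
    }
    if (flock($fp, LOCK_EX)) {    // blocks until we hold the exclusive lock
        fwrite($fp, date('c') . ' ' . $msg . "\n");
        fflush($fp);              // push the line to disk before unlocking
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}

log_message('something went wrong');
?>
```

For a single appended line, `file_put_contents($path, $line, FILE_APPEND | LOCK_EX)` does the open/lock/write/unlock dance in one call.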
Posted: Thu Jun 23, 2005 4:13 pm
by OldKat
When you flock() a file, what happens with all the other simultaneous file access requests? Are they all paused until the file is freed? And then, are they handled in order of access request?
Thanks in advance.