EDIT: Maybe you wonder why I used arrays? Don't bother, it is off topic as well.
This works sufficiently as long as I am only interested in the execution time of the script.
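For reference, a minimal sketch of the kind of wall-clock timing being discussed (the busy loop is just a hypothetical placeholder for the real work):

```php
<?php
// microtime(true) returns the current time as a float of seconds
// with microsecond precision, which is enough for script timing.
$start = microtime(true);

// ... work being benchmarked (placeholder busy loop) ...
$sum = 0;
for ($i = 0; $i < 100000; $i++) {
    $sum += $i;
}

$elapsed = microtime(true) - $start;
printf("elapsed: %.6f s\n", $elapsed);
```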
But what about overhead: network time, parsing time, compilation time, transfer time, and so on?
Well, parsing/compiling time is included if you have your call to $time as the first command, before all your includes.
Transfer/network time cannot be measured from within the language. The best thing I can think of would be to write a C program that will exec/fork lots of wgets, or something like that, and time it.
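To illustrate the "measure from outside" idea without leaving PHP: a *separate* client script timing a whole HTTP request does include network and transfer time, because the clock runs on the client, not inside the script being measured. The URL below is a placeholder.

```php
<?php
// Hedged sketch: time a request from a separate client script so that
// network + transfer time are part of the measurement. The URL is a
// placeholder; @ suppresses the warning if nothing is listening there.
$url = 'http://localhost/test.php';

$start = microtime(true);
$body  = @file_get_contents($url);
$total = microtime(true) - $start;

printf("round trip: %.6f s, %d bytes\n", $total, strlen((string) $body));
```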
nielsene wrote:Well parsing/compiling time is included, if you have your call to $time as the first command, before all your includes.
Is it? I thought the Zend engine read all of the file before starting to do things.
nielsene wrote:Transfer/network time cannot be measured from within the language.
Yes, of course.
nielsene wrote:The best thing I can think of would be to write a C program that will exec/fork lots of wgets, or something like that, and time it.
OK, so when I read about people testing database overhead, transfer rates, and web server performance, they have usually written some kind of software sensor around the connection?
Well, when I've put timestamps around my whole list of includes, I get a number that is very different from the one between two adjacent timestamp calls, so I'm assuming some amount of work is being done.
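The comparison I mean looks roughly like this (the includes are hypothetical and commented out so the sketch runs on its own; with real includes, the second delta is much larger than the first):

```php
<?php
// Baseline: the cost of two adjacent microtime() calls themselves.
$t0 = microtime(true);
$t1 = microtime(true);
$baseline = $t1 - $t0;

// The same measurement wrapped around a list of includes.
$t2 = microtime(true);
// include 'lib1.php';   // real includes would go here (hypothetical names)
// include 'lib2.php';
$t3 = microtime(true);
$includes = $t3 - $t2;

printf("baseline: %.6f s, includes: %.6f s\n", $baseline, $includes);
```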
Yes, a very lightweight compiled wrapper is normally used.
nielsene wrote:Well, when I've put timestamps around my whole list of includes, I get a number that is very different from the one between two adjacent timestamp calls, so I'm assuming some amount of work is being done.
Hmmm... Yes, I agree with that, but not with your statement that it can measure the parse/compile time of the script itself. That is how I understood your first post.
The script can't be running before it is completely checked for parse errors, can it?
Sounds crazy to me.
Well, interpreted languages check for parse errors one line at a time, at run time. I don't know if PHP has added a pre-check; I thought it didn't, but maybe it does.
If it does and you want to benchmark it, you can run the network test against a static HTML page and against PHP pages of varying length to measure the PHP launch overhead, and then quantify the parse time after subtracting out the time reported by PHP for the script execution itself.
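That comparison could be sketched as a small client loop; all URLs here are placeholders for the static and varying-length PHP pages you would set up:

```php
<?php
// Hedged sketch: average round-trip time for a static page versus
// PHP pages of different sizes. The difference between 'static' and
// 'php_small' approximates PHP launch overhead; the difference
// between the PHP pages approximates size-dependent parse time.
$urls = [
    'static'    => 'http://localhost/empty.html',
    'php_small' => 'http://localhost/small.php',
    'php_large' => 'http://localhost/large.php',
];

$avg = [];
foreach ($urls as $label => $url) {
    $start = microtime(true);
    for ($i = 0; $i < 5; $i++) {
        @file_get_contents($url);   // @ so a missing page fails quietly
    }
    $avg[$label] = (microtime(true) - $start) / 5;
    printf("%-10s %.6f s avg\n", $label, $avg[$label]);
}
```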
You can guess from error/warning messages what happens when.
If there is a parse error in your main script, you will get it before anything is executed:
volka wrote:You can guess from error/warning messages what happens when.
If there is a parse error in your main script, you will get it before anything is executed:
<br />
<b>Parse error</b>: parse error, unexpected '.' in <b>/var/www/objmodel/idx.php</b> on line <b>3</b><br />
Apparently, correct syntax is checked first. That seems to me to be overhead that cannot be measured from within the running script.
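You can verify that the whole file is parsed before any of it runs: a script with a syntax error on its *last* line still produces no output at all. The sketch below writes such a broken script to a temp file and lints it with `php -l` (via the PHP_BINARY constant), since this file itself has to stay parseable:

```php
<?php
// Demonstrates parse-before-execute: the echo in the broken script
// never runs, because parsing fails before execution starts.
$broken = <<<'SRC'
<?php
echo "this line never prints";
$x = . ;  // deliberate syntax error
SRC;

$file = tempnam(sys_get_temp_dir(), 'parse');
file_put_contents($file, $broken);

// php -l checks syntax only, without executing the script;
// it exits with a non-zero status when the file fails to parse.
exec(escapeshellarg(PHP_BINARY) . ' -l ' . escapeshellarg($file) . ' 2>&1',
     $out, $code);
echo implode("\n", $out), "\n";
unlink($file);
```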
I am not using an accelerator. For those who don't know what that is, I provide these links: Zend Accelerator and PHP Accelerator.
output 1<br />
<b>Warning</b>: Missing argument 2 for func() in <b>/var/www/objmodel/idx.php</b> on line <b>2</b><br />
funcoutput 2
The script continues without errors, but generates a warning.
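For contrast, the warning-then-continue pattern can be reproduced like this. (I'm not guessing at the exact script behind the output above; note also that the "Missing argument" warning shown there became a fatal ArgumentCountError in PHP 7.1, so this sketch uses trigger_error to stay runnable on current PHP.)

```php
<?php
// A warning is raised mid-script, but execution continues afterwards,
// unlike a parse error, which prevents the script from running at all.
echo "output 1\n";
$warned = trigger_error('something non-fatal happened', E_USER_WARNING);
echo "output 2\n";   // still executes after the warning
```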
Right now I don't know what to say or ask so I guess I'll just stop typing...