How do you benchmark PHP scripts?

XML, Perl, Python, and other languages can be discussed here, even if it isn't PHP (We might forgive you).

Moderator: General Moderators

Post Reply
User avatar
Heavy
Forum Contributor
Posts: 478
Joined: Sun Sep 22, 2002 7:36 am
Location: Viksjöfors, Hälsingland, Sweden
Contact:

How do you benchmark PHP scripts?

Post by Heavy »

I use the following in a script to measure the time consumed by a specific procedure:

Code: Select all

<?php
$time[] = array('Start', getmicrotime());

function getmicrotime()
{
    // microtime() returns "msec sec"; combine the two parts into a float
    list($usec, $sec) = explode(' ', microtime());
    return (float)$usec + (float)$sec;
}

// ... time-consuming operations ...
$time[] = array('End', getmicrotime());
echo '<br>Total time = ' . number_format($time[count($time)-1][1] - $time[0][1], 3, ',', ' ') . ' s';
?>
EDIT: Maybe you wonder why I used arrays? Don't bother; it is off topic as well.
This works well enough as long as I am only interested in the execution time of the script.
But what about overhead: network time, parsing time, compilation time, transfer time, and such?

How do you measure such things?

I am using Debian Linux 3.0
User avatar
nielsene
DevNet Resident
Posts: 1834
Joined: Fri Aug 16, 2002 8:57 am
Location: Watertown, MA

Post by nielsene »

Well, parsing/compiling time is included if you make your call to $time the first command, before all your includes.

Transfer/network time cannot be measured from within the language. The best thing I can think of would be to write a C program that execs/forks lots of wgets or something and times that.
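For what it's worth, the wget-wrapper idea can also be sketched in PHP itself: a separate client script that times a whole HTTP request with fsockopen(). This is only a sketch of the approach, not nielsene's actual tool; the host/path arguments are placeholders. It measures connection, server, and transfer time together, which is exactly what timestamps inside the benchmarked script cannot see.

```php
<?php
// Sketch: time a full HTTP request from a separate PHP client script,
// similar in spirit to wrapping wget in a timer.

function getmicrotime()
{
    // microtime() returns "msec sec"; combine the two parts into a float
    list($usec, $sec) = explode(' ', microtime());
    return (float)$usec + (float)$sec;
}

function time_request($host, $path, $port = 80, $timeout = 5)
{
    $start = getmicrotime();
    $fp = @fsockopen($host, $port, $errno, $errstr, $timeout);
    if (!$fp) {
        return false; // connection failed
    }
    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n");
    while (!feof($fp)) {
        fread($fp, 8192); // drain the whole response
    }
    fclose($fp);
    return getmicrotime() - $start; // includes network + server + transfer time
}
?>
```

Called as, e.g., `$t = time_request('www.example.com', '/index.php');` and repeated in a loop, this gives the end-to-end numbers that the in-script timer misses.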
User avatar
Heavy
Forum Contributor
Posts: 478
Joined: Sun Sep 22, 2002 7:36 am
Location: Viksjöfors, Hälsingland, Sweden
Contact:

Post by Heavy »

nielsene wrote:Well, parsing/compiling time is included if you make your call to $time the first command, before all your includes.
Is it? I thought the Zend engine read all of the file before starting to do things.
nielsene wrote:Transfer/network time cannot be measured from within the language.
Yes, of course.
nielsene wrote:The best thing I can think of would be to write a C program that execs/forks lots of wgets or something and times that.
Ok, so when I read about people testing database overhead, transfer rates and web server performance, they have usually written some kind of software sensor around the connection?
User avatar
nielsene
DevNet Resident
Posts: 1834
Joined: Fri Aug 16, 2002 8:57 am
Location: Watertown, MA

Post by nielsene »

Well, when I've put timestamps around my list of includes, I get a measurement that is very different from the one between two adjacent timestamp calls, so I'm assuming some amount of work is being done.

Yes, a very lightweight compiled wrapper is normally used.
User avatar
Heavy
Forum Contributor
Posts: 478
Joined: Sun Sep 22, 2002 7:36 am
Location: Viksjöfors, Hälsingland, Sweden
Contact:

Post by Heavy »

nielsene wrote:Well, when I've put timestamps around my list of includes, I get a measurement that is very different from the one between two adjacent timestamp calls, so I'm assuming some amount of work is being done.
Hmmm... Yes, I agree with that, but not with your statement that it can measure the parse/compile time of the script itself. That is how I understood your first post.

The script can't be running before it is completely checked for parse errors, can it?
Sounds crazy to me. :roll:
User avatar
nielsene
DevNet Resident
Posts: 1834
Joined: Fri Aug 16, 2002 8:57 am
Location: Watertown, MA

Post by nielsene »

Well, interpreted languages check for parse errors one line at a time, at run time. I don't know if PHP has added a pre-check; I thought it didn't, but maybe it does.

If it does and you want to benchmark it, you can run the network test against a static HTML page and against PHP pages of varying length to measure the PHP launch overhead, and then quantify the parse time by subtracting out the time reported by PHP for the script execution.
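The subtraction described above can be written down as a trivial helper. This is just a sketch of the arithmetic; every number below is a made-up measurement, not real data.

```php
<?php
// Sketch of the subtraction: all inputs are hypothetical measurements.
//   $total:  full request time for the PHP page, measured externally
//   $static: request time for a static HTML page of comparable size
//   $script: execution time reported from within the script itself
function parse_overhead($total, $static, $script)
{
    // launch + parse/compile overhead = whole request
    // minus the network/server baseline minus the measured run time
    return $total - $static - $script;
}

printf("%.3f s\n", parse_overhead(0.250, 0.080, 0.120)); // roughly 0.050 s
?>
```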
User avatar
Heavy
Forum Contributor
Posts: 478
Joined: Sun Sep 22, 2002 7:36 am
Location: Viksjöfors, Hälsingland, Sweden
Contact:

Post by Heavy »

Hmmm... Ok.

Thanks.
User avatar
volka
DevNet Evangelist
Posts: 8391
Joined: Tue May 07, 2002 9:48 am
Location: Berlin, ger

Post by volka »

You can guess from error/warning messages what happens when.
If there is a parse error in your main script, you will get it before anything is performed:

Code: Select all

<?php
echo 'output';
$string = . "where's the first operand?";
?>
Other things are checked at run time; e.g. try:

Code: Select all

<?php
function func($a, $b)
{
	echo 'func';
}

echo 'output 1';
func('only one parameter');
echo 'output 2';
?>
If your main script includes other scripts, they're parsed when the include(...) is hit.
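That last point can be made visible with a small experiment (my own sketch, not from the post above): a function defined in an included file does not exist until the include() call itself runs. The include file is generated on the fly with tempnam().

```php
<?php
// Small experiment: code in an included file only becomes available
// once the include() line is actually executed.
$file = tempnam(sys_get_temp_dir(), 'inc');
file_put_contents($file, '<?php function from_include() { return 42; }');

var_dump(function_exists('from_include')); // bool(false) - not parsed yet
include $file;
var_dump(function_exists('from_include')); // bool(true)  - parsed at include
unlink($file);
?>
```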
User avatar
Heavy
Forum Contributor
Posts: 478
Joined: Sun Sep 22, 2002 7:36 am
Location: Viksjöfors, Hälsingland, Sweden
Contact:

Post by Heavy »

volka wrote:You can guess from error/warning messages what happens when.
If there is a parse error in your main script, you will get it before anything is performed:

Code: Select all

<?php
echo 'output';
$string = . "where's the first operand?";
?>
the output was:

Code: Select all

<br />
<b>Parse error</b>:  parse error, unexpected '.' in <b>/var/www/objmodel/idx.php</b> on line <b>3</b><br />
Apparently, correct syntax is checked for first. This seems to me to be overhead that cannot be measured from within the running script.

I am not using an accelerator. For those who don't know what that is, I provide these links: Zend Accelerator and PHP Accelerator.

Code: Select all

<?php
function func($a, $b)
{
   echo 'func';
}

echo 'output 1';
func('only one parameter');
echo 'output 2';
?>
Output was:

Code: Select all

output 1<br />
<b>Warning</b>:  Missing argument 2 for func() in <b>/var/www/objmodel/idx.php</b> on line <b>2</b><br />
funcoutput 2
The script continues without errors, but generates a warning.
Right now I don't know what to say or ask, so I guess I'll just stop typing... :?
Post Reply