
Calculating Render Times

Posted: Thu Mar 11, 2004 6:13 am
by Pointybeard
I've an interesting conundrum: where, logically, in your script should a timer be started and stopped for maximum accuracy? At the moment my backend system starts it just after the main interface class is created, and stops it just after the page is displayed (print()ed to the browser). The problem is that since the page is displayed and then the timer is stopped, showing the render time means spitting it out after the page code. I really don't think that's valid :roll: You might suggest I just don't print() straight away, replace a RENDER tag instead, then print, but therein lies my conundrum. So, my question: what is the difference between these two:

interface class creation->timer started-> ...... ->display page(visible in browser)->stop timer

interface class creation->timer started-> ...... ->stop timer as the last action before actually giving the browser its output->display page(visible in browser)

The latter option would let me nicely display the render time in the browser, all valid. I did try it out, and from what I could tell there's not a huge difference: ~0.17s vs ~0.13s. *shrugs* Which do you think is the more accurate way to do it? Anyhow... just wondering. What do you guys logically do?
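The difference between the two orderings can be sketched roughly like this (a Python stand-in, since no actual code was posted; render_page() and the %RENDER% placeholder are made-up names): buffer the page instead of printing it directly, stop the timer as the last action before output, substitute the placeholder, then print. The timer then misses only the final string replace and the print itself, which would explain why the two measurements come out so close.

```python
import time

def render_page():
    # Stand-in for the real interface class doing its work;
    # the sleep simulates page-building time.
    time.sleep(0.01)
    return "<html><body>Hello</body></html><!-- rendered in %RENDER% -->"

start = time.time()            # timer started just after "interface class creation"
html = render_page()           # build the output in a buffer, don't print yet
elapsed = time.time() - start  # stop timer: last action before giving the browser its output
print(html.replace("%RENDER%", "%.4f s" % elapsed))
```

The only work left unmeasured is the replace() and the print(), which is cheap compared to building the page, so the second ordering stays honest while letting the time appear inside the page.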

-PB

Posted: Fri Mar 12, 2004 1:50 pm
by Selkirk
For maximum accuracy, forget about in-page timers altogether. Use ab (Apache Bench) or something equivalent to time your page. That times it from the user's perspective, the only one that counts, and is immune to the distortions of in-page timers.
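The point about client-side timing can be sketched like so (a Python stand-in for something like `ab -n 100 http://host/page`; the SlowHandler and the 0.05s sleep are made-up for illustration): the clock runs around the whole request/response cycle, so it catches network and server overhead that no in-page timer can see.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    """Toy page that takes ~0.05s of 'render time' to answer."""
    def do_GET(self):
        time.sleep(0.05)  # simulate server-side page building
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

def time_request(url, runs=3):
    """Time full request/response cycles from the client side, like ab does."""
    samples = []
    for _ in range(runs):
        start = time.time()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        samples.append(time.time() - start)
    return min(samples), sum(samples) / len(samples)

server = HTTPServer(("127.0.0.1", 0), SlowHandler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
fastest, mean = time_request("http://127.0.0.1:%d/" % server.server_address[1])
server.shutdown()
print("fastest %.3fs, mean %.3fs" % (fastest, mean))
```

Every sample here is at least the server's 0.05s of work plus connection overhead, which is exactly the extra cost the user experiences and an in-page timer never records.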

Posted: Fri Mar 12, 2004 5:33 pm
by penguinboy