Calculating Render Times
Posted: Thu Mar 11, 2004 6:13 am
I've got an interesting conundrum: where in your script, logically, should a timer be started and stopped for maximum accuracy? At the moment, my backend system starts it just after the main interface class is created and stops it just after the page is displayed (print()ed to the browser). The problem is that since the page is displayed before the timer is stopped, showing the render time means spitting it out after the page code. I really don't think that's valid.
You might suggest that I just don't print() it straight away, but instead do a replace of a RENDER tag and then print. But therein lies my conundrum. So, my question: what is the difference between these two?
interface class creation->timer started-> ...... ->display page(visible in browser)->stop timer
interface class creation->timer started-> ...... ->last action before actually giving the browser its output stop timer->display page(visible in browser)
The latter option would let me display the render time nicely in the browser, all valid. I did try it out, and from what I could tell there's not a huge difference: ~0.17s vs ~0.13s. *shrugs* Which do you think is the more accurate way to do it? Anyhow, just wondering. What do you guys logically do?
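For what it's worth, here's a minimal sketch of the second flow (stop the timer before any output is sent, then substitute a placeholder tag). It's in Python rather than whatever your backend uses, and build_page() is a hypothetical stand-in for your interface class doing its work; the {RENDER} tag is just an assumed placeholder name:

```python
import time

def build_page():
    # Hypothetical stand-in for the interface class building the page.
    body = "<html><body><p>Hello</p>"
    body += "<p>Rendered in {RENDER} s</p></body></html>"
    return body

start = time.perf_counter()              # timer started right after setup
page = build_page()                      # ... all the real work ...
elapsed = time.perf_counter() - start    # stop BEFORE sending any output
page = page.replace("{RENDER}", f"{elapsed:.4f}")
print(page)                              # browser gets the page with the time inline
```

The cost of this approach is exactly what you measured: the final replace() and the print() itself fall outside the timed window, which would explain a small gap like your ~0.17 vs ~0.13.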
-PB