Huge Data buffer

Posted: Sun Jun 15, 2003 6:29 pm
by ash_par
Hi,
I am working on a simple application that retrieves data from a flat database and converts it into graphs using jpgraph utility.

My problem is that when I specify a larger data range, PHP times out. I have raised the max_execution_time in php.ini from 30 to 60 seconds, and have also allocated 32M for the application instead of 8M.
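As an aside, the same limits can also be raised per-script at the top of the graphing page, without touching php.ini. A minimal sketch (the values here are illustrative, not recommendations; `set_time_limit()` has no effect when safe_mode is on):

```php
<?php
// Raise limits for this script only, instead of editing php.ini.
set_time_limit(120);              // seconds; 0 means no limit
ini_set('memory_limit', '64M');   // must exceed the peak size of the data arrays
?>
```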

But the application still does not process the data.

To give an idea of the data I am handling:

Under the worst-case scenario I could be handling 365 files with 150 lines each (roughly 55,000 data points).

Initially I thought that PHP was slow, but I get all the data assembled really fast; it's the jpgraph utility that takes most of the time!

I am passing arrays to my graphing functions, but the arrays are passed by reference.

I am working with a 123M machine right now. Please also suggest an optimum memory size, as I am planning to upgrade the machine.

Thanks in advance

jpgraph

Posted: Tue Jun 24, 2003 11:29 am
by barnbuster
I don't know if this is any help, but I have had a similar experience. I work remotely on a Linux server from my desktop and have a huge number of data arrays outputting into one graph... successfully.

However, I also have Apache and PHP configured on my desktop so I can work offline. That machine runs Win32, and it times out running the same script. The remote server and the desktop have almost identical php.ini configurations (I haven't checked the Apache configs too thoroughly), so I've put it down to a failing of my Win32 desktop.

A question for you regarding jpgraph:
Is it possible to output multiple graphs via a loop? The "headers already sent" issue appears to stop you from being able to do this.
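For what it's worth, the "headers already sent" error comes up because a single PHP response can only stream one image; each Stroke() to the browser sends image headers. One workaround I've seen discussed is to pass a filename to Stroke() inside the loop so each graph is written to disk, then embed the files with img tags. A rough sketch, assuming the standard jpgraph includes, a writable images/ directory, and made-up data:

```php
<?php
// Sketch only: assumes jpgraph is installed and ./images/ is writable.
include 'jpgraph/jpgraph.php';
include 'jpgraph/jpgraph_line.php';

$datasets = array(
    array(3, 8, 5, 9, 6),   // hypothetical data
    array(1, 4, 2, 7, 3),
);

foreach ($datasets as $i => $ydata) {
    $graph = new Graph(300, 200);
    $graph->SetScale('textlin');
    $graph->Add(new LinePlot($ydata));
    // Passing a filename to Stroke() writes the image to disk
    // instead of sending it (and its headers) to the browser.
    $file = "images/graph{$i}.png";
    $graph->Stroke($file);
    echo "<img src=\"$file\" alt=\"graph $i\">\n";
}
?>
```

The page that runs this loop then serves plain HTML, so only one set of headers ever goes out.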