PHP programming forum. Ask questions or help people concerning PHP code. Don't understand a function? Need help implementing a class? Don't understand a class? Here is where to ask. Remember to do your homework!
Hi All,
I've got what should be a simple question, but I haven't been able to find the answer so far...
Here is what I am trying to do: I need to read several large tables from one database and import them into another database. I'd like to set it up so that once a table has been uploaded, the page automatically moves on to another PHP file and prints something out like "processing table 2", and so on.
I've tried redirecting one page to the next, but the browser gets no output at all.
The code is something like this:
copydata.php
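(The original code block was replaced when the post was edited, so here is a minimal sketch of what a copydata.php along these lines might look like. The connection details, table names, and the copydata2.php filename are all placeholders, and it assumes the mysqli extension.)

```php
<?php
// Hypothetical sketch: copy one table from the source database to the
// target database, then redirect to the script that handles the next table.
$src = mysqli_connect('localhost', 'user', 'pass', 'source_db');
$dst = mysqli_connect('localhost', 'user', 'pass', 'target_db');

$result = mysqli_query($src, 'SELECT * FROM table1');
while ($row = mysqli_fetch_assoc($result)) {
    $cols = implode(', ', array_keys($row));
    $vals = implode("', '", array_map(
        function ($v) use ($dst) { return mysqli_real_escape_string($dst, $v); },
        array_values($row)
    ));
    mysqli_query($dst, "INSERT INTO table1 ($cols) VALUES ('$vals')");
}

// Hand off to the next step once this table is done.
// Note: header() only works if nothing has been echoed yet.
header('Location: copydata2.php');
exit;
```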
When it has finished, move it to a page with a little javascript in that 'physically' opens a new page, otherwise it's the equivalent of having it all on one page.
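Something like this, for example (a sketch; the message text and the copydata2.php filename are placeholders):

```php
<?php
// ... finish importing the current table above this point ...

// Emit a small page that shows progress and then 'physically'
// navigates to the next step, so the browser actually renders
// output between steps instead of waiting on one long request.
echo "<p>Table 1 done - processing table 2...</p>";
echo "<script>window.location.href = 'copydata2.php';</script>";
```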
META REFRESH tags are good and all... but the problem is that the imports take different amounts of time for different tables: some take a couple of seconds and others take several minutes.
Yes, that does indeed work, this solved one problem... but made another one appear.
See, the reason why I was splitting the DB upload process is that the script was taking too long and I was getting the error "Fatal error: Maximum execution time of 240 seconds exceeded in C:\wwwroot\ARCO\dbfunctions.php on line 69". (As you can see, I already upped the limit from the default 30 seconds to 240, but the table is rather big (about 30K records) and it "only" loads about 12K before it times out.)
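One thing worth trying before splitting further: PHP's set_time_limit() restarts the execution clock each time it is called (assuming safe mode is off), so calling it inside the copy loop effectively puts the limit on each iteration rather than on the whole script. A sketch, with the loop body left as a placeholder:

```php
<?php
// set_time_limit(0) removes the limit entirely for this script;
// alternatively, call it inside the loop to restart the timer
// on every row, so only a single stuck query can time out.
set_time_limit(0);

// $result assumed to be a result set from the source database.
while ($row = mysqli_fetch_assoc($result)) {
    set_time_limit(60);   // restart a fresh 60-second clock per row
    // ... insert $row into the target table here ...
}
```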
I thought splitting the process up would help, since there are many small tables and just a few big ones... but the really BIG ones won't play ball...
Is there a way to do an asynchronous call, or something to that effect, that would allow me to check on the status of things and wait until completion?
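It isn't true asynchronous processing, but one common workaround is to copy the big table in chunks, carry the current offset in the query string, and let the page refresh itself until the copy is complete, so each request stays well under the time limit and the browser shows progress. A sketch, with hypothetical connection details and table name:

```php
<?php
// Chunked copy: each request moves $chunk rows, then reloads itself
// at the next offset until fewer than $chunk rows come back.
$chunk  = 1000;
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;

$src = mysqli_connect('localhost', 'user', 'pass', 'source_db');
$res = mysqli_query($src, "SELECT * FROM bigtable LIMIT $offset, $chunk");
$copied = mysqli_num_rows($res);
// ... insert the fetched rows into the target database here ...

if ($copied == $chunk) {
    // More rows may remain: reload this script at the next offset.
    $next = $offset + $chunk;
    echo "<p>Copied " . ($offset + $copied) . " rows so far...</p>";
    echo "<meta http-equiv='refresh' content='1;url=?offset=$next'>";
} else {
    echo "<p>Done: " . ($offset + $copied) . " rows copied.</p>";
}
```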