
load big file into database by php

Posted: Sun Mar 04, 2007 1:09 am
by orlandinho
Hi,

I am writing an app that needs maintenance every month. The maintenance consists of creating a table for that month and uploading a file with nearly 20 million rows to populate it.

This should be done through a web interface.

The problem is the time this process will take to complete; PHP may hit its max execution time.

I was thinking about using an async call to leave this process running in the background.

Maybe someone can help with a solution.

PS: I am working with MySQL 5.0.27.

Thanks

Posted: Sun Mar 04, 2007 2:41 am
by s.dot
The following will make sure the PHP script doesn't time out:

Code: Select all

set_time_limit(0);
ignore_user_abort(true); //just in case someone closes the browser
As for the file you're uploading, is it an SQL file?

You could use a form to upload the file to the server, then something like the following to execute it:

Code: Select all

// Each exec() spawns a separate process, so "use" and "\."
// in their own calls would never reach the mysql client.
// Run the whole import as a single command instead:
exec('mysql -u username -ppassword -h localhost database_name < '
    . escapeshellarg('/path/to/' . $uploaded_file));
I'm not sure how well that would work. But that's my input. ;)
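Since orlandinho mentioned wanting an async call: one option (just a sketch, assuming a Unix host; the credentials, database name, and paths are placeholders) is to push the mysql invocation into the background so the web request returns immediately:

```php
<?php
// Placeholder path and credentials; adjust for your setup.
$file = '/path/to/upload.sql';

// nohup plus a trailing & detaches the command from the web
// request; redirecting stdout/stderr lets exec() return as soon
// as the shell has forked, instead of waiting hours for mysql.
$cmd = 'nohup mysql -u username -ppassword database_name < '
     . escapeshellarg($file) . ' > /tmp/import.log 2>&1 &';
exec($cmd);
```

You could then poll /tmp/import.log (or a status table) from another page to report progress.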

Posted: Sun Mar 04, 2007 2:59 am
by pickle
Is this an SQL file? Could you use phpMyAdmin to import it?

As far as I know, any time a PHP script spends waiting for the database to finish does NOT count towards the 30-second timeout (so the script can wait for hours while MySQL finishes without timing out).

Posted: Sun Mar 04, 2007 8:48 am
by feyd
User raghavan20 has a bunch of threads involving the decomposition of an SQL file into individual statements that can be processed. If memory serves, the meat is in the Regex board.

There are also tools like mysqlimport and BigDump.

Posted: Sun Mar 04, 2007 10:55 am
by orlandinho
Actually, the file is a CSV file with approximately 20 million rows.
When I load this data from the mysql prompt, it takes nearly 4 hours.
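Since it's a CSV, MySQL's LOAD DATA INFILE is worth trying before anything row-by-row: the server parses the file itself, which is usually the fastest route. A sketch (the table name, delimiters, and the commented-out query call are assumptions; with LOCAL you also need local_infile enabled on both client and server):

```php
<?php
// Sketch: table name and delimiters are placeholders.
$csv = '/path/to/data.csv';
$sql = "LOAD DATA LOCAL INFILE '" . addslashes($csv) . "' " .
       "INTO TABLE monthly_table " .
       "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' " .
       "LINES TERMINATED BY '\\n'";
// Run it over your existing connection, e.g.:
// mysql_query($sql) or die(mysql_error());
echo $sql;
```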

Posted: Sun Mar 04, 2007 11:21 am
by feyd
fgetcsv() may be of interest.

A script will take even longer than the command line.
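To flesh out the fgetcsv() idea, here is a self-contained sketch (the table name, sample data, and batch size are made up) that reads the CSV line by line and groups rows into multi-row INSERT statements, which cuts the number of round trips to the server:

```php
<?php
// Self-contained sketch: writes a tiny sample CSV, then reads it
// back with fgetcsv() and builds batched multi-row INSERTs.
// Table name is a placeholder; the query call is commented out
// so this runs without a database.
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, "1,alice\n2,bob\n3,carol\n");

$batchSize = 2;          // use something like 1000 for real data
$batch = array();
$fh = fopen($path, 'r');
while (($row = fgetcsv($fh)) !== false) {
    // Escape each field and wrap the row as ('a','b',...)
    $batch[] = "('" . implode("','", array_map('addslashes', $row)) . "')";
    if (count($batch) >= $batchSize) {
        echo 'INSERT INTO monthly_table VALUES ' . implode(',', $batch) . "\n";
        // mysql_query(...) would go here in the real script
        $batch = array();
    }
}
if ($batch) {
    // Flush whatever is left over after the loop
    echo 'INSERT INTO monthly_table VALUES ' . implode(',', $batch) . "\n";
}
fclose($fh);
unlink($path);
```

Even batched like this, expect it to be slower than LOAD DATA INFILE or the mysql command-line client.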