I have created a web-based subsystem that has to update all the customer data from another system on a regular schedule (daily, weekly, even monthly).
The big issue is that the raw data from the other system is very large (last week I had a 90 MB raw text file).
The method I have currently is to upload the text file to the server, then use a PHP script to read it line by line and update the records in my MSSQL database.
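For reference, the import loop looks roughly like this (a simplified sketch: the tab-delimited layout, the `customers` table, and the `sqlsrv` connection details are placeholders, not my real schema):

```php
<?php
// Split one tab-delimited line into a customer id and its remaining fields.
// The tab-delimited layout is an assumption for this sketch.
function parse_line(string $line): array {
    $fields = explode("\t", rtrim($line, "\r\n"));
    $id = array_shift($fields);
    return [$id, $fields];
}

// Read the uploaded file line by line (fgets keeps memory low even at 90 MB)
// and update each record. Table and column names are placeholders.
function import_file(string $path, $conn): void {
    $handle = fopen($path, 'r');
    while (($line = fgets($handle)) !== false) {
        [$id, $fields] = parse_line($line);
        sqlsrv_query(
            $conn,
            "UPDATE customers SET name = ? WHERE customer_id = ?",
            [$fields[0], $id]
        );
    }
    fclose($handle);
}

// Only connect and run when invoked from the command line with a file
// argument, so the functions above can be reused without touching the DB.
if (php_sapi_name() === 'cli' && isset($argv[1])) {
    $conn = sqlsrv_connect('localhost', ['Database' => 'mydb']);
    import_file($argv[1], $conn);
}
```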
Because the file is so large, uploading it to the server takes a long time, and after the upload I still have to read the text file and run the record updates. Because the whole process takes so long, the client browser usually stops responding even though the process is still running on the server.
Another solution I have tried is compressing the text file, extracting it on the server, and then starting the record updates. The result is the same: the client browser usually stops responding while the process is still running on the server.
I have increased the maximum execution time in php.ini, but that still has not solved my problem.
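For what it's worth, these are the php.ini directives that seem relevant to my situation (the values shown are just examples, not my actual settings):

```ini
; Script runtime limit - already raised, but the browser still gives up
max_execution_time = 3600

; Upload limits also have to accommodate the 90 MB file
upload_max_filesize = 128M
post_max_size = 128M

; Memory available to the PHP process reading the file
memory_limit = 256M
```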
Is there any solution to a big problem like this?
I cannot access the data in the other system directly, so the text file is my workaround.
Sometimes I have the idea of creating a background job like cron on Unix, but I'm using MS Windows for both the database server and the web server.
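If it helps, I imagine the Windows equivalent would be a Scheduled Task that runs the import script through the PHP CLI, something like the following (all paths and names here are hypothetical):

```shell
schtasks /Create /SC DAILY /ST 02:00 /TN "CustomerImport" ^
    /TR "C:\php\php.exe C:\scripts\import_customers.php C:\data\customers.txt"
```

But I'm not sure whether this is the right approach, or how to get the uploaded file to the server before the task runs.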