Loading really large files
I have a csv file that is over 5MB and I need to parse it and convert it to an array so I can check it against a database. PHP's file_get_contents() reads it just fine, but when I go to explode() the result, it chokes and says Allowed memory size of 8388608 bytes exhausted. How should I go about this? I've never worked with a file this large before.
- Kieran Huggins
- DevNet Master
- Posts: 3635
- Joined: Wed Dec 06, 2006 4:14 pm
- Location: Toronto, Canada
- Contact:
You could increase the memory limit in php.ini.

If that's not an option and you have PHP 5.1.0 or greater installed, you can use the optional arguments (offset & maxlen) to load the file in segments.

http://www.php.net/manual/en/function.file-get-contents.php wrote:
file_get_contents ( string filename [, bool use_include_path [, resource context [, int offset [, int maxlen]]]] )
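As a minimal sketch (not from the thread) of that segmented approach: read at most a fixed number of bytes per call so only one chunk is ever in memory. The function name, chunk size, and callback are illustrative.

```php
<?php
// Sketch: read a large file in segments using the 5.1.0+ offset and
// maxlen arguments of file_get_contents(), so only $chunkSize bytes
// are held in memory at a time. All names here are illustrative.
function process_in_chunks($filename, $chunkSize, $handler)
{
    $size   = filesize($filename);
    $offset = 0;
    while ($offset < $size) {
        // Read at most $chunkSize bytes starting at $offset.
        $chunk = file_get_contents($filename, false, null, $offset, $chunkSize);
        $handler($chunk);
        $offset += strlen($chunk);
    }
}
```

With a CSV you would still need to handle rows that straddle a chunk boundary, which is why fgetcsv() (below in the thread) is usually the simpler tool.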
I've successfully used fgetcsv() to import a postal code database into mysql. The file was about 600 MB. 
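The reason fgetcsv() copes with a 600 MB file is that it parses one row per call, so memory use stays flat regardless of file size. A small sketch (filename and function name are hypothetical):

```php
<?php
// Sketch: fgetcsv() returns one parsed row per call, so memory use
// stays constant no matter how large the file is. Here we just count
// rows; a real import would INSERT each row into MySQL instead.
function count_csv_rows($filename)
{
    $count = 0;
    $fh = fopen($filename, 'r');
    while (($row = fgetcsv($fh, 1000, ',')) !== FALSE) {
        // $row is an array of the fields on a single line;
        // process it here instead of buffering the whole file.
        $count++;
    }
    fclose($fh);
    return $count;
}
```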
OK, let me explain why I need this functionality and see if anybody has any better ideas. The fgetcsv() function works great, except now I am trying to load in two other files (not quite as large, but still big). These three files are exports from vendor databases. I need to take each csv file, add up the total amount in stock for each SKU (which is a field in each csv file), and then update a MySQL database with the results. Anybody know what the best way to do this would be? Just about everything I've done has resulted in:

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate xxx bytes)

I'm on a shared server, so my options are limited.
like...

Code: Select all
LOAD DATA INFILE ...

feyd wrote: If possible, sure, but fgetcsv() works too.
Code: Select all
foreach ($file as $val)
{
    $fh = fopen($val['file_name'], 'r');
    while (($data = fgetcsv($fh, 1000, ',', '"')) !== FALSE)
    {
        $vendor_sku = getAlnum($data[0]);
        $available  = (integer) getDigits($data[$val['stock_position']]);
        // UPDATE products SET in_stock = in_stock + $available WHERE pSku = '$vendor_sku'
    }
    fclose($fh);
}

EDIT: sigh... it seems no matter what I do, it exceeds allowed memory size.
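One way around the memory limit, as a sketch (not from the thread): stream each vendor file with fgetcsv() and accumulate the per-SKU totals in an array, then issue the UPDATEs afterwards. Only the totals array grows, never the raw file contents, and the queries run outside the parse loop. The function name and the $file structure below are illustrative, mirroring the loop in the post.

```php
<?php
// Sketch: accumulate in-stock totals per SKU while streaming each
// vendor CSV row by row, instead of issuing a query per row or
// holding whole files in memory. $file mirrors the array in the post:
// each entry has a 'file_name' and a 'stock_position' column index.
function tally_stock(array $file)
{
    $totals = array();
    foreach ($file as $val) {
        $fh = fopen($val['file_name'], 'r');
        while (($data = fgetcsv($fh, 1000, ',', '"')) !== FALSE) {
            $sku       = $data[0];
            $available = (int) $data[$val['stock_position']];
            if (!isset($totals[$sku])) {
                $totals[$sku] = 0;
            }
            $totals[$sku] += $available;
        }
        fclose($fh);
    }
    return $totals;
}
// Afterwards, for each $sku => $qty:
//   UPDATE products SET in_stock = in_stock + $qty WHERE pSku = '$sku'
```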