
Upload script for CSV: is there a size limitation?

Posted: Mon Oct 07, 2002 8:55 am
by mccommunity
I have written a PHP script for uploading .csv files. It works great until the file reaches a certain size (around 700K or so), and then it fails. I am wondering if there is some sort of PHP size limitation? Below is the script:
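If the limit turns out to be PHP's configuration rather than the script itself, a quick sketch like this will print the php.ini directives that most often cap uploads (the directive names are standard; the values shown will depend on your installation):

```php
<?php
// Print the php.ini directives that commonly limit file uploads.
// upload_max_filesize and post_max_size cap the request size;
// memory_limit and max_execution_time can also abort large parses.
foreach (array("upload_max_filesize", "post_max_size",
               "memory_limit", "max_execution_time") as $directive) {
    echo $directive, " = ", ini_get($directive), "\n";
}
?>
```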

<?php


$row = 1;
$fp = fopen($userfile, "r"); // $userfile is the uploaded file's temp name (relies on register_globals)

// First pass: make sure every row has exactly 56 fields
while ($data = fgetcsv($fp, 1000, ",")) {
    $num = count($data);
    if ($num < 56) {
        $errors[] = "Row $row (run # $data[0]) is missing fields.";
    }
    if ($num > 56) {
        $errors[] = "Row $row (Item # $data[0]) has too many fields.";
    }
    $row++;
}
error_check();

/*
We do a check to ensure the numbers are unique
*/
rewind($fp);
while ($data = fgetcsv($fp, 1000, ",")) {
    $test[] = $data[0];
}
$one = count($test);
$test = array_unique($test);
$two = count($test);

if ($one != $two) {
    $errors[] = "There were duplicate numbers found; data processing cannot continue!";
}

error_check();

/*
Now we go back to the beginning of the file and do individual data checks on each column
*/

/** NOTE: WE DO NO ERROR CHECKING ON THE FILE; ERROR CHECKING WILL BE DONE BY THE CLIENT SOFTWARE

rewind($fp);
$row = 1;

while ($data = fgetcsv($fp, 1000, ",")) {
    $num = count($data);

    $row++;
}
error_check();

** end of NOTE **/


// OK, moving on: all data has checked out, so we loop through again and insert it into the db.
// Grab the auction event passed in the form submission.
$event = $HTTP_POST_VARS['event'];

include_once("$CFG->libdir/dbdef.lib.php");
$c = pg_connect($manager);

rewind($fp);
$row = 1;
while ($data = fgetcsv($fp, 1000, ",")) {
    // Note: values are interpolated unescaped; a quote in the CSV data will break this query
    $query = "INSERT INTO data_inventory VALUES ($event";
    $num_fields = count($data);
    for ($x = 0; $x < $num_fields; $x++) {
        $query .= ", '$data[$x]'";
    }
    $query .= ")";
    $r = pg_exec($c, $query);
    if ($r) {
        // echo "row $row loaded ok <br />";
    } else {
        $errors[] = "There was a problem uploading row $row";
    }
    $row++;
}

error_check();

?>
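Separate from the size question, one thing worth flagging in the INSERT loop above: the CSV values are interpolated into the query string unescaped, so any field containing a single quote will make `pg_exec` fail (which would show up as "There was a problem uploading row N"). A minimal sketch of escaping each value first, assuming `$event`, `$data`, and `$c` are set up as in the script (`pg_escape_string` is available from PHP 4.2):

```php
<?php
// Build the value list with each CSV field escaped for PostgreSQL,
// so embedded quotes no longer break the statement.
$query = "INSERT INTO data_inventory VALUES ($event";
foreach ($data as $field) {
    $query .= ", '" . pg_escape_string($field) . "'";
}
$query .= ")";
$r = pg_exec($c, $query);
?>
```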