
Adding data to existing row-column

Posted: Fri Sep 01, 2006 8:26 am
by anjanesh
I got this table

Code:

mysql> describe `bin-data`;
+-------------+--------------+------+-----+---------+----------------+
| Field       | Type         | Null | Key | Default | Extra          |
+-------------+--------------+------+-----+---------+----------------+
| FileID      | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| Name        | varchar(255) | NO   |     | NULL    |                |
| Size        | bigint(20)   | NO   |     | NULL    |                |
| Data        | longblob     | NO   |     | NULL    |                |
+-------------+--------------+------+-----+---------+----------------+
4 rows in set (0.00 sec)

Code:

<?php
error_reporting(E_ALL);
set_time_limit(3600);

mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('test');

$Filename = 'A100MBFile.dat';

$sql = "INSERT INTO `bin-data` SET `Name` = 'Sample', `Size` = '".filesize($Filename)."'";
$res = mysql_query($sql) or die(mysql_error());
$id = mysql_insert_id();

$fh = fopen($Filename, "rb"); // "rb": binary-safe read mode
while (!feof($fh))
 {
        $buffer = fread($fh, 1024); // read one 1KB chunk
        // Note: mysql_real_escape_string() is safer than addslashes() for binary data
        mysql_query("UPDATE `bin-data` SET `Data` = CONCAT(`Data`, '".addslashes($buffer)."') WHERE `FileID` = $id");
 }
fclose($fh);
?>
Obviously, CONCAT is not good here, as it returns the existing data with the buffer appended. I'm looking for a solution that adds the data using only 1KB at a time.

Is there a way?

Thanks
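A minimal sketch of why the chunked approach is already bounded on the client side: the helper below (hypothetical, not from the original post) builds the same UPDATE string the loop sends. Its length depends only on the chunk, not on how much is already stored in `Data`, so each query stays around 1KB no matter how large the file grows:

```php
<?php
// Hypothetical helper: build the per-chunk UPDATE statement the loop sends.
// The statement's length grows only with the chunk, never with the data
// already stored in `Data` on the server.
function chunk_update_sql($chunk, $id)
{
    // addslashes() as in the original post; mysql_real_escape_string()
    // is safer for arbitrary binary data.
    return "UPDATE `bin-data` SET `Data` = "
         . "CONCAT(`Data`, '" . addslashes($chunk) . "') "
         . "WHERE `FileID` = " . (int) $id;
}

$chunk = str_repeat('x', 1024);   // one 1KB chunk
$sql   = chunk_update_sql($chunk, 7);
// strlen($sql) is roughly 1.1KB, regardless of how big `Data` already is.
```

Whether the server buffers the old value while evaluating CONCAT() is a separate question (raised later in the thread); the statement itself, at least, never grows.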

Posted: Fri Sep 01, 2006 9:04 am
by onion2k
You can use CONCAT() in an update, it works fine.

Whether it works on a BLOB though, that's another question.

Posted: Fri Sep 01, 2006 9:19 am
by anjanesh
It does work for BLOB datatypes, but I'm looking for a solution that sends only 1KB in that UPDATE string.

CONCAT(`Data`, '".addslashes($buffer)."') returns (1KB + existing size of `Data`), which keeps increasing with every UPDATE query. This is not ideal for storing huge files.

Posted: Fri Sep 01, 2006 9:50 am
by onion2k
Obviously if you concatenate the data in the database with the contents of $buffer you're going to get data.$buffer. That's what the concat() function is for.

If you want to get 1KB of data.$buffer then you'll need to get the contents of data, append $buffer, chop off whatever bit you don't want, and then update the database with the new contents. There won't be a simple PHP function for it.

Or, if you're using MySQL 5 you might be able to create a stored procedure to do it, but that would be pretty complicated.

Posted: Fri Sep 01, 2006 5:20 pm
by feyd
Stored procedure? Overkill. Use the SUBSTRING() function. And this thread really should be in Databases.

Posted: Fri Sep 01, 2006 7:03 pm
by anjanesh
I'm not trying to retrieve data from the MySQL database; I'm trying to insert a 100MB file into a LONGBLOB field by adding 1KB at a time, instead of one big $data = fread(fopen($Filename, "r"), filesize($Filename)) which reads all 100MB into the $data string.

Posted: Fri Sep 01, 2006 7:21 pm
by feyd
Then you'll need to create a loop to read out 1024 byte chunks of the file until you run out of the file.

Posted: Fri Sep 01, 2006 9:08 pm
by anjanesh
Loop is there.

Code:

while (!feof($fh))
 {
        $buffer = fread($fh, 1024); // 1KB
        mysql_query("UPDATE `bin-data` SET `Data` = CONCAT(`Data`, '".addslashes($buffer)."') WHERE `FileID` = $id");
 }
But how do I keep appending 1024-byte chunks to the existing `Data`?
Appending 1KB to an existing 1GB file won't incur the overhead of reading the 1GB file first: you open it in append mode and add the 1KB of data to it.

I want to do the same thing for a LONGBLOB field in a mysql database.
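For comparison, here is that filesystem version as a runnable sketch (hypothetical filenames; a small generated file stands in for the 100MB one). Append mode is what makes each write cost proportional to the chunk rather than to the whole file:

```php
<?php
// Filesystem analogy: with append mode, adding 1KB to an existing file
// never re-reads what has already been written.
$src = tempnam(sys_get_temp_dir(), 'src');
$dst = tempnam(sys_get_temp_dir(), 'dst');
file_put_contents($src, str_repeat('x', 3 * 1024 + 100)); // stand-in "big" file

$in  = fopen($src, 'rb');
$out = fopen($dst, 'ab');           // 'a' = append: every write goes to the end
while (!feof($in)) {
    fwrite($out, fread($in, 1024)); // move one 1KB chunk at a time
}
fclose($in);
fclose($out);

$same = (filesize($dst) === filesize($src)); // the copy is byte-complete
```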

Posted: Fri Sep 01, 2006 9:13 pm
by feyd
Have you tried that code? It sure looks like it would work to me.

Posted: Fri Sep 01, 2006 10:06 pm
by anjanesh
Run 1:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 0KB + 1KB = 1KB
`Data` is now set to 1KB of data

Run 2:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 1KB + 1KB = 2KB
`Data` is now set to 2KB of data

Run 3:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 2KB + 1KB = 3KB
`Data` is now set to 3KB of data

Run 4:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 3KB + 1KB = 4KB
`Data` is now set to 4KB of data

Run 5:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 4KB + 1KB = 5KB
`Data` is now set to 5KB of data

Run n:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = (n-1)KB + 1KB = nKB
`Data` is now set to nKB of data


I need something like this:

Code:

mysql_query("UPDATE `bin-data` ADD '".addslashes($buffer)."' TO `Data` WHERE `FileID` = $id");
- without assigning `Data` to an expression that returns more than 1KB.

Posted: Fri Sep 01, 2006 10:48 pm
by feyd
The CONCAT call doesn't assign the field to any other variable. The functionality you're imagining doesn't exist. You've already found the only manner in which this will work. Be happy.

Posted: Fri Sep 01, 2006 11:02 pm
by anjanesh
Wouldn't CONCAT(`a`, `b`) store the result temporarily somewhere, thus taking sizeof(`a`) + sizeof(`b`) of space?

Posted: Fri Sep 01, 2006 11:07 pm
by feyd
That would depend on the optimizations the database in question chooses. For MySQL, or several other databases, answering that definitively would potentially involve looking at their internal code, if you really want to know. I'm certainly not going to look in there for you.

The problem is, there is no alternative. You will need to use it if you continue with the implementation you've currently chosen.
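One caveat worth adding, hedged since behavior depends on the server's SQL mode: CONCAT() returns NULL if any argument is NULL, so if the row ever starts out with `Data` = NULL, every append would silently produce NULL. Seeding the column with an empty string (or wrapping it in IFNULL()) keeps the append pattern safe; the values below are illustrative:

```sql
-- Seed `Data` explicitly so CONCAT() never sees NULL:
INSERT INTO `bin-data` SET `Name` = 'Sample', `Size` = 104857600, `Data` = '';

-- Each append then ships only the new chunk over the wire:
UPDATE `bin-data`
SET `Data` = CONCAT(IFNULL(`Data`, ''), '...escaped 1KB chunk...')
WHERE `FileID` = 1;
```

Chunking also keeps each statement under MySQL's max_allowed_packet limit, which a single 100MB INSERT would exceed at typical defaults of the time.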

Posted: Fri Sep 01, 2006 11:45 pm
by anjanesh
I'm storing a 100MB+ file in the LONGBLOB field.
I wonder how cooltoad stores huge files.