Adding data to existing row-column

Questions about MySQL, PostgreSQL, and most other databases, as well as using them with PHP, can be asked here.

Moderator: General Moderators

Post Reply
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Adding data to existing row-column

Post by anjanesh »

I got this table

Code: Select all

mysql> describe `bin-data`;
+-------------+--------------+------+-----+---------+----------------+
| Field       | Type         | Null | Key | Default | Extra          |
+-------------+--------------+------+-----+---------+----------------+
| FileID      | bigint(20)   | NO   | PRI | NULL    | auto_increment |
| Name        | varchar(255) | NO   |     | NULL    |                |
| Size        | bigint(20)   | NO   |     | NULL    |                |
| Data        | longblob     | NO   |     | NULL    |                |
+-------------+--------------+------+-----+---------+----------------+
4 rows in set (0.00 sec)

Code: Select all

<?php
error_reporting(E_ALL);
set_time_limit(3600);

mysql_connect('localhost', 'root', '');
mysql_select_db('test');

$Filename = 'A100MBFile.dat';

// Set `Data` to '' up front: CONCAT(NULL, ...) would return NULL
$sql = "INSERT INTO `bin-data` SET `Name` = 'Sample', `Size` = ".filesize($Filename).", `Data` = ''";
$res = mysql_query($sql) or die(mysql_error());
$id = mysql_insert_id();

$fh = fopen($Filename, "rb"); // "b": binary-safe on all platforms
while (!feof($fh))
 {
        $buffer = fread($fh, 1024); // 1KB
        // mysql_real_escape_string() is binary-safe; addslashes() is not reliable here
        mysql_query("UPDATE `bin-data` SET `Data` = CONCAT(`Data`, '".mysql_real_escape_string($buffer)."') WHERE `FileID` = $id") or die(mysql_error());
 }
fclose($fh);
?>
Obviously, CONCAT is not good here, as it evaluates to the existing data plus the buffer. I'm looking for a solution that sends only 1KB at a time.

Is there a way?

Thanks
User avatar
onion2k
Jedi Mod
Posts: 5263
Joined: Tue Dec 21, 2004 5:03 pm
Location: usrlab.com

Post by onion2k »

You can use CONCAT() in an UPDATE; it works fine.

Whether it works on a BLOB though, that's another question.
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

It does work for BLOB datatypes, but I'm looking for a solution where each UPDATE reads only 1KB.

CONCAT(`Data`, '".addslashes($buffer)."') evaluates to (existing size of `Data` + 1KB), which keeps growing with every UPDATE query. This is not ideal for storing huge files.
User avatar
onion2k
Jedi Mod
Posts: 5263
Joined: Tue Dec 21, 2004 5:03 pm
Location: usrlab.com

Post by onion2k »

Obviously if you concatenate the data in the database with the contents of $buffer you're going to get data.$buffer. That's what the concat() function is for.

If you only want 1KB of data.$buffer, then you'll need to get the contents of data, append $buffer, chop off whatever bit you don't want, and then update the database with the new contents. There won't be a simple PHP function for it.

Or, if you're using MySQL 5, you might be able to create a stored procedure to do it, but that would be pretty complicated.
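For what it's worth, such a procedure wouldn't be much more than a wrapper around the same CONCAT. A hedged sketch of what it might look like on MySQL 5 (the procedure and parameter names are made up for illustration):

```sql
-- Hypothetical append procedure: one CALL per chunk.
DELIMITER //
CREATE PROCEDURE append_chunk(IN p_id BIGINT, IN p_chunk LONGBLOB)
BEGIN
    UPDATE `bin-data`
    SET `Data` = CONCAT(`Data`, p_chunk)
    WHERE `FileID` = p_id;
END //
DELIMITER ;

-- Usage: CALL append_chunk(42, '...1KB chunk...');
```

It still concatenates server-side, so it doesn't change the memory picture, it only tidies up the calling code.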
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Stored procedure? Overkill. The SUBSTRING() function... And this thread really should be in Databases.
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

I'm not trying to retrieve data from the MySQL database. I'm trying to insert a 100MB file into a LONGBLOB field by adding 1KB at a time, instead of one big $data = fread(fopen($Filename, "r"), filesize($Filename)), which loads the whole 100MB into the $data string.
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Then you'll need to create a loop that reads out 1024-byte chunks of the file until you run out of file.
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

The loop is already there.

Code: Select all

while (!feof($fh))
 {
        $buffer = fread($fh, 1024); // 1KB
        mysql_query("UPDATE `bin-data` SET `Data` = CONCAT(`Data`, '".mysql_real_escape_string($buffer)."') WHERE `FileID` = $id");
 }
But how do I keep appending 1024-byte chunks to the existing `Data`?
Appending 1KB to an existing 1GB file won't have the overhead of reading the 1GB file first: you open it in append mode and write 1KB to it.

I want to do the same thing for a LONGBLOB field in a MySQL database.
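To make the filesystem analogy concrete, here's a minimal PHP sketch (the file name is hypothetical) of what append mode does: each write goes straight to the end of the file, with no read-back of the existing contents.

```php
<?php
// Demonstrates append mode: no read of existing contents is needed.
// 'append-demo.dat' is a hypothetical file name for this sketch.
file_put_contents('append-demo.dat', '');   // start the demo from empty

for ($i = 0; $i < 3; $i++) {
    $fh = fopen('append-demo.dat', 'ab');   // 'a' = append, 'b' = binary
    fwrite($fh, str_repeat('x', 1024));     // write 1KB at the end
    fclose($fh);
}

clearstatcache();
echo filesize('append-demo.dat');           // 3072: three 1KB appends
```

The question in this thread is whether MySQL offers an equivalent for a LONGBLOB column.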
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

Have you tried that code? It sure looks like it would work to me.
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

Run 1:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 0KB + 1KB = 1KB
`Data` is now set to 1KB of data

Run 2:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = 1KB + 1KB = 2KB
`Data` is now set to 2KB of data

...

Run n:
Size of CONCAT(`Data`, '".addslashes($buffer)."') = (n-1)KB + 1KB = nKB
`Data` is now set to nKB of data


Need something like this :

Code: Select all

mysql_query("UPDATE `bin-data` ADD '".addslashes($buffer)."' TO `Data` WHERE `FileID` = $id");
(without assigning `Data` to an expression that evaluates to more than 1KB)
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

The concat call doesn't assign the field to any other variable. The functionality you are thinking of doesn't exist. You've already found the only manner in which this will work. Be happy.
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

Wouldn't CONCAT(`a`, `b`) store the result temporarily somewhere, thus taking sizeof(`a`) + sizeof(`b`) of space?
User avatar
feyd
Neighborhood Spidermoddy
Posts: 31559
Joined: Mon Mar 29, 2004 3:24 pm
Location: Bothell, Washington, USA

Post by feyd »

That would depend on the optimization the database in question chooses. For MySQL, or several other databases, knowing definitively would potentially involve looking at their internal code. I'm certainly not going to look in there for you.

The problem is, there is no alternative: you will need to use it if you continue with the implementation you've currently chosen.
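For what it's worth, the mysqli extension offers a streaming alternative: mysqli_stmt::send_long_data() ships a BLOB parameter to the server in chunks, so PHP never holds the whole file and no escaping or CONCAT round-trips are needed. A hedged sketch, assuming the table and file names from this thread; the chunking helper below is a hypothetical name:

```php
<?php
// Helper for the chunked-upload idea in this thread: yields successive
// 1KB slices of a file, so the caller never holds more than one chunk.
function file_chunks(string $path, int $size = 1024): Generator
{
    $fh = fopen($path, 'rb');            // binary-safe read
    while (!feof($fh)) {
        $chunk = fread($fh, $size);
        if ($chunk !== '' && $chunk !== false) {
            yield $chunk;
        }
    }
    fclose($fh);
}

// Hypothetical streaming upload (assumes a live MySQL server and the
// `bin-data` table from this thread):
//
//   $stmt = $db->prepare("INSERT INTO `bin-data` (`Name`, `Size`, `Data`) VALUES (?, ?, ?)");
//   $stmt->bind_param('sib', $name, $size, $null);    // 'b' = blob, streamed
//   foreach (file_chunks('A100MBFile.dat') as $chunk) {
//       $stmt->send_long_data(2, $chunk);  // appends chunk to param #2 server-side
//   }
//   $stmt->execute();  // still subject to max_allowed_packet on the server
```

Note the server still assembles the full value before executing, so max_allowed_packet has to be large enough for the whole file.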
User avatar
anjanesh
DevNet Resident
Posts: 1679
Joined: Sat Dec 06, 2003 9:52 pm
Location: Mumbai, India

Post by anjanesh »

I'm storing a 100MB+ file in the LONGBLOB field.
I wonder how cooltoad stores huge files.
Post Reply