Which is better in this case? DB or File?
Posted: Sat Sep 29, 2007 5:51 pm
by legend986
I need to write about 3 million values that are the output of a particular test. Which would be better: writing them to a database or to a file?
Posted: Sat Sep 29, 2007 5:56 pm
by volka
Probably depends on what you're planning to do with the data.
Posted: Sat Sep 29, 2007 6:01 pm
by legend986
Well... Alright, I'll explain...
The input would be given to a PHP file. This input data contains at most a million numbers. Which is better for storing it, a db or a file?
The output would contain the original number along with a response number. Which is better for that, a db or a file?
Posted: Sat Sep 29, 2007 6:12 pm
by volka
The script will always read all numbers sequentially?
Posted: Sat Sep 29, 2007 6:14 pm
by legend986
I'll explain in terms of the actual data.
Input all the IP addresses into a database whose structure is: id, ip, asn. In the beginning, asn is NULL or empty. The iteration checks the asn field of each ip; if it is empty, it sends that particular ip to a request server to get its asn. Once it gets the asn, it updates the database.
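A minimal sketch of that structure, assuming a MySQL table (the table name and column sizes are my guesses, not from the post):

```sql
-- Hypothetical table matching the id, ip, asn structure described above
CREATE TABLE ip_asn (
    id  INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    ip  VARCHAR(15) NOT NULL,   -- dotted-quad IPv4 address
    asn INT UNSIGNED NULL       -- NULL until the whois lookup fills it in
);

-- Rows that still need a lookup (note IS NULL, not = NULL):
SELECT id, ip FROM ip_asn WHERE asn IS NULL;
```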
Posted: Sat Sep 29, 2007 6:23 pm
by volka
legend986 wrote:Once it gets the asn, it updates the database.
You've already made your choice.
And because you're searching for NULL fields and you're updating the data you've probably made the right decision.
Posted: Sat Sep 29, 2007 6:27 pm
by legend986
Ok.. Thank you. Then in that case, I will implement it in db itself.
One doubt, however:
When there are many values, is there a way to limit the number of items fetched, then refresh the page and fetch the next batch, so that the page won't face a timeout?
For example, I'm writing this code:
Code:
$sql_ip = "SELECT * FROM ".$table_main." WHERE asn IS NULL"; // "asn=NULL" never matches; NULL needs IS NULL
$result_ip = mysql_query($sql_ip);
while ($row_ip = mysql_fetch_assoc($result_ip)) {
    curl_setopt($ch, CURLOPT_POSTFIELDS,
        "action=do_whois&family=ipv4&method_whois=whois&bulk_paste=".$row_ip['ip']."&submit_paste=Submit");
    $result = curl_exec($ch);
    //At this stage, we posted an ip address and we got what we required.
    $pattern = "/[1-9][0-9]{0,8}\s\s\s[|]\s[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/";
    preg_match($pattern, $result, $req_info);
    $pattern_for_ASN = "/[1-9][0-9]{0,8}/"; // [1-9]{1,9} would stop at the first 0 inside an ASN
    preg_match($pattern_for_ASN, $req_info[0], $asn);
    //And finally we update the corresponding ASN
    mysql_query("UPDATE ".$table_main." SET asn='".$asn[0]."' WHERE id=".$row_ip['id']);
}
The above code works, but I think the script hits a timeout because of the amount of data... Any workaround for this?
Posted: Sat Sep 29, 2007 6:29 pm
by volka
legend986 wrote:When there are many values, is there a way that I limit the number of items fetched and then refresh the page and then fetch the next cycle, so that the page won't face a timeout?
pagination is the keyword for your search.
Posted: Sat Sep 29, 2007 6:32 pm
by legend986
Just to clarify things before I start reading again, I would like to say that I do not want to display any items... I just want to fetch them and then operate upon them and then update the database. Is this still related to pagination?
Posted: Sat Sep 29, 2007 6:44 pm
by volka
legend986 wrote:and then refresh the page
yes, pagination.
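Since you don't display anything, "pagination" here just means processing the rows in fixed-size batches so no single request runs long enough to time out. A minimal sketch of the batching logic (the function name and the array stand-in are made up for illustration; in your real script each batch would come from a SELECT ... LIMIT query, and since processed rows get their asn filled in, they drop out of the next SELECT on their own, so you don't even need an OFFSET):

```php
<?php
// Process items in fixed-size batches; returns the number of batches run.
// $items stands in for the rows the database would return.
function process_in_batches(array $items, $batch_size, $callback) {
    $batches = 0;
    for ($offset = 0; $offset < count($items); $offset += $batch_size) {
        $batch = array_slice($items, $offset, $batch_size);
        foreach ($batch as $item) {
            $callback($item); // whois lookup + UPDATE would go here
        }
        $batches++;
        // In the real script: stop after one batch, refresh the page,
        // and re-run "SELECT ... WHERE asn IS NULL LIMIT $batch_size".
    }
    return $batches;
}
```

For example, 10 items in batches of 3 gives 4 batches (3 + 3 + 3 + 1).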
Posted: Sat Sep 29, 2007 6:46 pm
by legend986
Ok Thank you... I will start that now...