I have this database of websites that I crawl periodically for content. I keep the URLs in another table, and of course the URLs must be unique in that table. Now, what's the best way of doing this?
Setting the column as UNIQUE, inserting everything, and letting MySQL keep the column unique? Or performing a check before inserting?
The first one sounds faster, but is it really good programming practice? I mean, does it have any issues with performance, security, integrity, or bugs?
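A minimal sketch of the first approach (let the database enforce uniqueness and handle the duplicate-key error in the application). This uses Python's standard-library SQLite as a stand-in, since a MySQL server isn't assumed here; with a MySQL driver the same idea applies, with the duplicate surfacing as error 1062:

```python
import sqlite3

# Illustration only: SQLite stands in for MySQL. The table and function
# names are made up for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (id INTEGER PRIMARY KEY, url TEXT UNIQUE)")

def save_url(conn, url):
    """Insert the URL and let the UNIQUE constraint reject duplicates."""
    try:
        conn.execute("INSERT INTO urls (url) VALUES (?)", (url,))
        return True
    except sqlite3.IntegrityError:
        # Duplicate: the database refused the row, no pre-check needed.
        return False

print(save_url(conn, "http://example.com/"))  # True: first insert
print(save_url(conn, "http://example.com/"))  # False: duplicate rejected
```

The point is that the application still sees the duplicate (as an exception or error code), so it can react, without doing its own lookup first.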
Avoid duplicate entries: database or application level?
Last edited by pedrotuga on Wed Oct 25, 2006 10:18 am, edited 1 time in total.
Re: Avoid duplicate entries: database or application level?
pedrotuga wrote: setting the column as unique and insert everything and let MySQL keep the column unique.
Definitely. Your application can still respond to the error.
pedrotuga wrote: Or perform a check before inserting?
You mean "SELECT then INSERT"? That's not atomic; you'd have to lock the table for the whole procedure, or it's prone to race conditions.
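Besides catching the error, the duplicate handling can be folded into a single atomic statement, which also avoids the race. A sketch, again using SQLite as a stand-in; MySQL spells this `INSERT IGNORE` (or `INSERT ... ON DUPLICATE KEY UPDATE` when you want an upsert):

```python
import sqlite3

# Illustration only: SQLite's INSERT OR IGNORE plays the role of
# MySQL's INSERT IGNORE. Names here are made up for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (url TEXT PRIMARY KEY)")

def save_url_ignore(conn, url):
    """Insert if new, silently skip duplicates, in one atomic statement."""
    cur = conn.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
    return cur.rowcount == 1  # 1 row inserted; 0 means it already existed

print(save_url_ignore(conn, "http://example.com/"))  # True
print(save_url_ignore(conn, "http://example.com/"))  # False
```

Because it is one statement, two crawler processes inserting the same URL at the same time cannot both succeed, with no table lock and no SELECT-then-INSERT window.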
I think I will let the database take care of the integrity.
Thank you for the answer.
I am using MySQL... I didn't even know that it supported locks.
BTW, this is a bit OT, but... is anybody using MySQL transactions successfully? It looks like all those new MySQL features: triggers, transactions, procedures, etc. are still at an early stage...
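For what it's worth, the transaction pattern itself is simple; the usual MySQL caveat is that it only works on InnoDB tables (MyISAM silently ignores transactions). A minimal sketch of the all-or-nothing behavior, shown with SQLite as a stand-in:

```python
import sqlite3

# Illustration only: SQLite stands in for an InnoDB table in MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (url TEXT PRIMARY KEY)")

try:
    with conn:  # commits on success, rolls back on exception
        conn.execute("INSERT INTO urls (url) VALUES (?)", ("http://a.example/",))
        conn.execute("INSERT INTO urls (url) VALUES (?)", ("http://a.example/",))
except sqlite3.IntegrityError:
    pass  # the duplicate aborted the whole transaction

# Neither row survives: the first insert was rolled back too.
count = conn.execute("SELECT COUNT(*) FROM urls").fetchone()[0]
print(count)  # 0
```

That all-or-nothing rollback is the property transactions buy you; whether a given MySQL installation delivers it depends on the storage engine the table uses.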