
avoid duplicate entries. database or application level?

Posted: Tue Oct 24, 2006 6:30 pm
by pedrotuga
I have this database of websites that I crawl periodically for content. I keep the URLs in another table, and of course the URLs must be unique in that table. Now, what's the best way of doing this?

Setting the column as unique, inserting everything, and letting MySQL keep the column unique? Or performing a check before inserting?

The first one sounds faster, but is it really good programming practice? I mean, does it have any issues with performance, security, integrity or bugs?

Re: avoid duplicate entries. database or application level?

Posted: Tue Oct 24, 2006 6:44 pm
by volka
pedrotuga wrote:Setting the column as unique, inserting everything, and letting MySQL keep the column unique?
Definitely. Your application can still respond to the error.
pedrotuga wrote:Or perform a check before inserting?
You mean "SELECT then INSERT"? That's not atomic; you'd have to lock the table for the whole procedure, or it's prone to race conditions.
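To make the idea concrete, here is a minimal sketch using Python's standard sqlite3 module in place of MySQL (the table name `urls` and the helper `add_url` are made up for illustration; with a MySQL driver the idea is the same, only the connector and error class differ). The database's UNIQUE constraint rejects the duplicate, and the application simply catches the resulting error:

```python
import sqlite3

# In-memory database for illustration; a UNIQUE constraint on the url column
# makes the database itself refuse duplicate entries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (id INTEGER PRIMARY KEY, url TEXT UNIQUE)")

def add_url(conn, url):
    """Insert a URL, relying on the UNIQUE constraint to reject duplicates."""
    try:
        conn.execute("INSERT INTO urls (url) VALUES (?)", (url,))
        conn.commit()
        return True   # newly inserted
    except sqlite3.IntegrityError:
        return False  # already present; the database enforced uniqueness

print(add_url(conn, "http://example.com/"))  # True
print(add_url(conn, "http://example.com/"))  # False: duplicate rejected
```

With a check-first approach, two crawler processes could both SELECT, see no row, and both INSERT, which is exactly the race condition mentioned above; a single INSERT guarded by the constraint is atomic.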

Posted: Wed Oct 25, 2006 1:14 pm
by pedrotuga
I think I will let the database take care of the integrity.
Thank you for the answer.

I am using MySQL... I didn't even know that it supported locks.
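As a side note, the duplicate can also be swallowed entirely at the SQL level, with no error handling in the application: MySQL spells this INSERT IGNORE, and SQLite has the equivalent INSERT OR IGNORE. A self-contained sketch with sqlite3 (table and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (url TEXT UNIQUE)")

# INSERT OR IGNORE silently skips rows that would violate the UNIQUE
# constraint, so the application never sees an error for duplicates.
for url in ["http://example.com/", "http://example.com/", "http://example.net/"]:
    conn.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM urls").fetchone()[0]
print(count)  # 2: the duplicate was ignored
```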

BTW, this is a bit OT, but... is anybody using MySQL transactions successfully? It looks like all those new MySQL features, triggers, transactions, stored procedures, etc., are still at an early stage...