Avoid duplicate entries: database or application level?
Posted: Tue Oct 24, 2006 6:30 pm
I have a database of websites that I crawl periodically for content, and I keep the URLs in another table. Of course the URLs must be unique in that table. Now, what's the best way of doing this?
Set the column as unique, insert everything, and let MySQL keep the column unique? Or perform a check before each insert?
The first one sounds faster, but is it really good programming practice? I mean, does it have any issues with performance, security, integrity, or bugs?
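For reference, the constraint-based approach might look like the sketch below. This is an illustrative example using Python's sqlite3 module in place of MySQL (the table name `urls` and the sample data are made up); the MySQL equivalents would be a UNIQUE index plus `INSERT IGNORE` or `INSERT ... ON DUPLICATE KEY UPDATE`.

```python
import sqlite3

# Sketch: let the database enforce uniqueness instead of checking first.
# sqlite3 stands in for MySQL here; in MySQL you would declare the column
# UNIQUE and use INSERT IGNORE rather than INSERT OR IGNORE.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE urls (url TEXT UNIQUE)")

crawled = [
    "http://example.com/a",
    "http://example.com/b",
    "http://example.com/a",  # duplicate, silently skipped by the constraint
]
for url in crawled:
    # No SELECT-before-INSERT needed; this also avoids the race condition
    # between the check and the insert when several crawlers run at once.
    conn.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM urls").fetchone()[0]
print(count)  # 2
```

One point in favour of the constraint: a check-then-insert done in application code is not atomic, so two concurrent crawlers can both pass the check and insert the same URL, while a unique index makes that impossible.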